scan | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- en
license:
- bsd
multilinguality:
- monolingual
pretty_name: SCAN
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: scan
tags:
- multi-turn
dataset_info:
- config_name: simple
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 3217770
num_examples: 16728
- name: test
num_bytes: 799912
num_examples: 4182
download_size: 4080388
dataset_size: 4017682
- config_name: addprim_jump
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 2535625
num_examples: 14670
- name: test
num_bytes: 1508445
num_examples: 7706
download_size: 4111174
dataset_size: 4044070
- config_name: addprim_turn_left
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 3908891
num_examples: 21890
- name: test
num_bytes: 170063
num_examples: 1208
download_size: 4148216
dataset_size: 4078954
- config_name: filler_num0
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 2513034
num_examples: 15225
- name: test
num_bytes: 330087
num_examples: 1173
download_size: 2892291
dataset_size: 2843121
- config_name: filler_num1
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 2802865
num_examples: 16290
- name: test
num_bytes: 330087
num_examples: 1173
download_size: 3185317
dataset_size: 3132952
- config_name: filler_num2
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 3106220
num_examples: 17391
- name: test
num_bytes: 330087
num_examples: 1173
download_size: 3491975
dataset_size: 3436307
- config_name: filler_num3
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 3412704
num_examples: 18528
- name: test
num_bytes: 330087
num_examples: 1173
download_size: 3801870
dataset_size: 3742791
- config_name: length
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 2672464
num_examples: 16990
- name: test
num_bytes: 1345218
num_examples: 3920
download_size: 4080388
dataset_size: 4017682
- config_name: template_around_right
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 2513034
num_examples: 15225
- name: test
num_bytes: 1229757
num_examples: 4476
download_size: 3801870
dataset_size: 3742791
- config_name: template_jump_around_right
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 3412704
num_examples: 18528
- name: test
num_bytes: 330087
num_examples: 1173
download_size: 3801870
dataset_size: 3742791
- config_name: template_opposite_right
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 2944398
num_examples: 15225
- name: test
num_bytes: 857943
num_examples: 4476
download_size: 3861420
dataset_size: 3802341
- config_name: template_right
features:
- name: commands
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 3127623
num_examples: 15225
- name: test
num_bytes: 716403
num_examples: 4476
download_size: 3903105
dataset_size: 3844026
config_names:
- addprim_jump
- addprim_turn_left
- filler_num0
- filler_num1
- filler_num2
- filler_num3
- length
- simple
- template_around_right
- template_jump_around_right
- template_opposite_right
- template_right
---
# Dataset Card for "scan"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/brendenlake/SCAN](https://github.com/brendenlake/SCAN)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 224.18 MB
- **Size of the generated dataset:** 44.53 MB
- **Total amount of disk used:** 268.71 MB
### Dataset Summary
SCAN tasks with various splits.
SCAN is a set of simple language-driven navigation tasks for studying
compositional learning and zero-shot generalization.
See https://github.com/brendenlake/SCAN for a description of the splits.
Example usage:
```python
from datasets import load_dataset

# The config name selects the split scheme (e.g. "simple", "length", "addprim_jump")
data = load_dataset("scan", "length")
```
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### addprim_jump
- **Size of downloaded dataset files:** 18.69 MB
- **Size of the generated dataset:** 4.05 MB
- **Total amount of disk used:** 22.73 MB
An example of 'train' looks as follows.
```
{
    "commands": "jump",
    "actions": "I_JUMP"
}
```
#### addprim_turn_left
- **Size of downloaded dataset files:** 18.69 MB
- **Size of the generated dataset:** 4.09 MB
- **Total amount of disk used:** 22.76 MB
An example of 'train' looks as follows.
```
{
    "commands": "run twice",
    "actions": "I_RUN I_RUN"
}
```
#### filler_num0
- **Size of downloaded dataset files:** 18.69 MB
- **Size of the generated dataset:** 2.85 MB
- **Total amount of disk used:** 21.53 MB
An example of 'train' looks as follows.
```
{
    "commands": "walk twice",
    "actions": "I_WALK I_WALK"
}
```
#### filler_num1
- **Size of downloaded dataset files:** 18.69 MB
- **Size of the generated dataset:** 3.14 MB
- **Total amount of disk used:** 21.82 MB
An example of 'train' looks as follows.
```
{
    "commands": "run left",
    "actions": "I_TURN_LEFT I_RUN"
}
```
#### filler_num2
- **Size of downloaded dataset files:** 18.69 MB
- **Size of the generated dataset:** 3.44 MB
- **Total amount of disk used:** 22.12 MB
An example of 'train' looks as follows.
```
{
    "commands": "look right twice",
    "actions": "I_TURN_RIGHT I_LOOK I_TURN_RIGHT I_LOOK"
}
```
### Data Fields
The data fields are the same among all splits.
#### addprim_jump
- `commands`: a `string` feature.
- `actions`: a `string` feature.
#### addprim_turn_left
- `commands`: a `string` feature.
- `actions`: a `string` feature.
#### filler_num0
- `commands`: a `string` feature.
- `actions`: a `string` feature.
#### filler_num1
- `commands`: a `string` feature.
- `actions`: a `string` feature.
#### filler_num2
- `commands`: a `string` feature.
- `actions`: a `string` feature.
### Data Splits
| name |train|test|
|-----------------|----:|---:|
|addprim_jump |14670|7706|
|addprim_turn_left|21890|1208|
|filler_num0 |15225|1173|
|filler_num1 |16290|1173|
|filler_num2 |17391|1173|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{Lake2018GeneralizationWS,
title={Generalization without Systematicity: On the Compositional Skills of
Sequence-to-Sequence Recurrent Networks},
author={Brenden M. Lake and Marco Baroni},
booktitle={ICML},
year={2018},
url={https://arxiv.org/pdf/1711.00350.pdf},
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
Joragasy/CultureNuc_ft | ---
license: mit
---
|
wybxc/books | ---
license: odc-by
task_categories:
- text-generation
language:
- zh
size_categories:
- 100K<n<1M
---
Single- and multi-turn dialogue corpora extracted from novels and other sources. |
proculation/mytestds2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: is_impossible
dtype: bool
- name: answers
struct:
- name: answer_start
dtype: int32
- name: text
dtype: string
splits:
- name: train
num_bytes: 1092344
num_examples: 721
download_size: 147635
dataset_size: 1092344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gauss314/opciones | ---
license: apache-2.0
task_categories:
- tabular-classification
- tabular-regression
tags:
- Merval
- options
pretty_name: Merval historical options data, for deep learning and machine learning tests
---
# Downloading this Options Dataset
This document will guide you through the steps to download the Merval options dataset from Hugging Face Datasets.
To start, you'll need to install Hugging Face's `datasets` library if you haven't done so already.
You can do this using the following pip command:
```python
!pip install datasets
```
Here's the Python code to load the Merval options dataset from Hugging Face Datasets and convert it into a pandas DataFrame:
```python
from datasets import load_dataset
import pandas as pd
id = "gauss314/opciones"
data = load_dataset(id)
df = pd.DataFrame(data['train'][:])
``` |
dolo650/alpaca_1k | ---
license: apache-2.0
---
|
EleutherAI/cifarnet | ---
dataset_info:
features:
- name: img
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': automobile
'2': bird
'3': cat
'4': deer
'5': dog
'6': frog
'7': horse
'8': ship
'9': truck
splits:
- name: train
num_bytes: 1560708615.0
num_examples: 190000
- name: test
num_bytes: 82238790.0
num_examples: 10000
download_size: 1642628895
dataset_size: 1642947405.0
---
CIFARNet contains 200K images sampled from ImageNet-21K (Winter 2019 release), resized to 64x64, using coarse-grained labels that roughly match those of CIFAR-10. The exact ImageNet synsets used were:
```
{
"n02691156": 0, # airplane
"n02958343": 1, # automobile
"n01503061": 2, # bird
"n02121620": 3, # cat
"n02430045": 4, # deer
"n02083346": 5, # dog
"n01639765": 6, # frog
"n02374451": 7, # horse
"n04194289": 8, # ship
"n04490091": 9, # truck
}
```
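The mapping above can also be used programmatically, e.g. to recover the CIFAR-10-style class name for a given synset. A minimal stdlib-only sketch (the dict is copied verbatim from the table above; the names match the YAML `class_label` entries):

```python
# ImageNet synset -> CIFAR-10-style label id, as listed in this card.
SYNSET_TO_LABEL = {
    "n02691156": 0,  # airplane
    "n02958343": 1,  # automobile
    "n01503061": 2,  # bird
    "n02121620": 3,  # cat
    "n02430045": 4,  # deer
    "n02083346": 5,  # dog
    "n01639765": 6,  # frog
    "n02374451": 7,  # horse
    "n04194289": 8,  # ship
    "n04490091": 9,  # truck
}

# Class names in label order (matches the class_label names in the YAML header).
CLASS_NAMES = [
    "airplane", "automobile", "bird", "cat", "deer",
    "dog", "frog", "horse", "ship", "truck",
]

def synset_to_class_name(synset: str) -> str:
    """Map an ImageNet synset id to its CIFAR-10-style class name."""
    return CLASS_NAMES[SYNSET_TO_LABEL[synset]]

print(synset_to_class_name("n02121620"))  # -> cat
```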
The classes are balanced, and the dataset is pre-split into a training set of 190K images and a test set of 10K images. |
tyzhu/find_marker_both_sent_train_400_eval_40_last_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 5973002.880726015
num_examples: 4188
- name: validation
num_bytes: 220570
num_examples: 200
download_size: 1111218
dataset_size: 6193572.880726015
---
# Dataset Card for "find_marker_both_sent_train_400_eval_40_last_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
korean-corpus/korquad_v1.0_namu_candidates_256 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
sequence: string
- name: answers
list:
- name: answer_start
sequence: int64
- name: id
dtype: string
- name: origin_answer_start
sequence: int64
- name: origin_text
sequence: string
- name: text
sequence: string
- name: similar_context
sequence: string
splits:
- name: train
num_bytes: 46928632
num_examples: 9606
- name: validation
num_bytes: 4662215
num_examples: 960
download_size: 27292916
dataset_size: 51590847
---
# Dataset Card for "korquad_v1.0_namu_candidates_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Adeptschneider/swahili_common_voice | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-be45ecbd-7284772 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: patrickvonplaten/bert2bert_cnn_daily_mail
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: patrickvonplaten/bert2bert_cnn_daily_mail
* Dataset: cnn_dailymail
To run new evaluation jobs, visit Hugging Face's [automatic evaluation service](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
AdaptLLM/law-tasks | ---
configs:
- config_name: SCOTUS
data_files:
- split: test
path: scotus/test.json
- config_name: CaseHOLD
data_files:
- split: test
path: case_hold/test.json
- config_name: UNFAIR_ToS
data_files:
- split: test
path: unfair_tos/test.json
task_categories:
- text-classification
- question-answering
- zero-shot-classification
language:
- en
tags:
- legal
---
# Domain Adaptation of Large Language Models
This repo contains the **evaluation datasets** for our **ICLR 2024** paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
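Schematically, the transformation appends comprehension-style tasks to each raw passage. The toy, stdlib-only sketch below only mimics the shape of the output; it is not the authors' actual pipeline, which mines the tasks from the passage itself (see the linked paper and code):

```python
def to_reading_comprehension(passage: str, qa_pairs) -> str:
    """Toy sketch: turn a raw corpus passage into a reading-comprehension
    text by appending question/answer tasks. Illustrative only; the real
    pipeline derives the tasks from the passage itself."""
    tasks = "\n\n".join(f"Question: {q}\nAnswer: {a}" for q, a in qa_pairs)
    return f"{passage}\n\n{tasks}"

example = to_reading_comprehension(
    "The court held that the contract was void for lack of consideration.",
    [("Why was the contract void?", "It lacked consideration.")],
)
print(example)
```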
### 🤗 We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! 🤗
**************************** **Updates** ****************************
* 2024/4/2: Released the raw data splits (train and test) of all the evaluation datasets
* 2024/1/16: 🎉 Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024!!!🎉
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B.
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B.
## Domain-Specific LLaMA-1
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available on Hugging Face: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM), and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM). The performance of AdaptLLM compared to other domain-specific LLMs is shown below:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension can perfectly fit the data format** by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).
For example, to chat with the law-chat model:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("AdaptLLM/law-chat")
tokenizer = AutoTokenizer.from_pretrained("AdaptLLM/law-chat")
# Put your input here:
user_input = '''Question: Which of the following is false about ex post facto laws?
Options:
- They make criminal an act that was innocent when committed.
- They prescribe greater punishment for an act than was prescribed when it was done.
- They increase the evidence required to convict a person than when the act was done.
- They alter criminal offenses or punishment in a substantially prejudicial manner for the purpose of punishing a person for some past activity.
Please provide your choice first and then provide explanations if possible.'''
# Apply the prompt template and system prompt of LLaMA-2-Chat demo for chat models (NOTE: NO prompt template is required for base models!)
our_system_prompt = "\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\n\nIf a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n" # Please do NOT change this
prompt = f"<s>[INST] <<SYS>>{our_system_prompt}<</SYS>>\n\n{user_input} [/INST]"
# # NOTE:
# # If you want to apply your own system prompt, please integrate it into the instruction part following our system prompt like this:
# your_system_prompt = "Please, answer this question faithfully."
# prompt = f"<s>[INST] <<SYS>>{our_system_prompt}<</SYS>>\n\n{your_system_prompt}\n{user_input} [/INST]"
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).input_ids.to(model.device)
outputs = model.generate(input_ids=inputs, max_length=4096)[0]
answer_start = int(inputs.shape[-1])
pred = tokenizer.decode(outputs[answer_start:], skip_special_tokens=True)
print(f'### User Input:\n{user_input}\n\n### Assistant Output:\n{pred}')
```
## Domain-Specific Tasks
### Pre-templatized/Formatted Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions for the test split of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
**Note:** those filled-in instructions are specifically tailored for models before alignment and do NOT fit for the specific data format required for chat models.
### Raw Datasets
We have also uploaded the raw training and testing splits, for facilitating fine-tuning or other usages:
- [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt)
- [RCT](https://huggingface.co/datasets/AdaptLLM/RCT)
- [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA)
- [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA)
- [Headline](https://huggingface.co/datasets/AdaptLLM/Headline)
- [NER](https://huggingface.co/datasets/AdaptLLM/NER)
- [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
The other datasets used in our paper are already available on the Hugging Face Hub, and you can load them directly with the following code:
```python
from datasets import load_dataset
# MQP:
dataset = load_dataset('medical_questions_pairs')
# PubmedQA:
dataset = load_dataset('bigbio/pubmed_qa')
# USMLE:
dataset = load_dataset('GBaker/MedQA-USMLE-4-options')
# SCOTUS
dataset = load_dataset("lex_glue", 'scotus')
# CaseHOLD
dataset = load_dataset("lex_glue", 'case_hold')
# UNFAIR-ToS
dataset = load_dataset("lex_glue", 'unfair_tos')
```
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
``` |
AdapterOcean/med_alpaca_standardized_cluster_47_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 11417125
num_examples: 8087
download_size: 6032990
dataset_size: 11417125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_47_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shermansiu/sgd_dst | ---
license: cc-by-sa-4.0
tags:
- natural-language-processing
- dialogue-state-tracking
language:
- en
pretty_name: Schema-Guided Dialogue dataset - Dialogue State Tracking
configs:
- config_name: sgd-sdt
data_files:
- split: train
path: "sdt/sgd_sdt_v0_train.tsv"
- split: dev
path: "sdt/sgd_sdt_v0_dev.tsv"
- split: test
path: "sdt/sgd_sdt_v0_test.tsv"
- config_name: sgd-d3st
data_files:
- split: train
path: "d3st/sgd_d3st_v0_train.tsv"
- split: dev
path: "d3st/sgd_d3st_v0_dev.tsv"
- split: test
path: "d3st/sgd_d3st_v0_test.tsv"
dataset_info:
- config_name: sgd-sdt
features:
- name: prompt
dtype: string
- name: target
dtype: string
- name: dialogue_id
dtype: string
- name: turn_id
dtype: string
- name: frame_id
dtype: string
splits:
- name: train
num_bytes: 269690581
num_examples: 175780
- name: dev
num_bytes: 38376392
num_examples: 26077
- name: test
num_bytes: 72134551
num_examples: 46116
- config_name: sgd-d3st
features:
- name: prompt
dtype: string
- name: target
dtype: string
- name: dialogue_id
dtype: string
- name: turn_id
dtype: string
- name: frame_id
dtype: string
splits:
- name: train
num_bytes: 226090375
num_examples: 175780
- name: dev
num_bytes: 32170435
num_examples: 26077
- name: test
num_bytes: 59997238
num_examples: 46116
---
# Schema-Guided Dialogue dataset - Dialogue State Tracking
This dataset contains the Schema-Guided Dialogue Dataset, formatted according to the prompt formats from the following two dialogue state tracking papers:
- Description-Driven Dialogue State Tracking (D3ST) ([Zhao et al., 2022](https://arxiv.org/abs/2201.08904))
- Show, Don't Tell (SDT) ([Gupta et al., 2022](https://arxiv.org/abs/2204.04327))
Data processing code: https://github.com/google-research/task-oriented-dialogue
Original dataset: https://github.com/google-research-datasets/dstc8-schema-guided-dialogue
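To make the two prompt styles concrete, here is a schematic, stdlib-only illustration of a D3ST-style input, in which indexed slot descriptions are prepended to the dialogue. This only sketches the idea; the exact serialization in the TSV files follows the papers and the processing code linked above:

```python
def d3st_style_prompt(slot_descriptions, dialogue_turns):
    """Schematic D3ST-style prompt: index each slot description, then
    append the dialogue turns. Illustrative only; see Zhao et al. (2022)
    and the data-processing repo for the exact format."""
    slots = " ".join(f"{i}: {desc}" for i, desc in enumerate(slot_descriptions))
    dialogue = " ".join(f"[{spk}] {utt}" for spk, utt in dialogue_turns)
    return f"{slots} {dialogue}"

prompt = d3st_style_prompt(
    ["name of the restaurant", "time of the reservation"],
    [("user", "Book me a table at Luigi's at 7pm.")],
)
print(prompt)
# 0: name of the restaurant 1: time of the reservation [user] Book me a table at Luigi's at 7pm.
```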
|
indiehackers/Telugu_InstructData | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 82739156
num_examples: 33350
download_size: 32498948
dataset_size: 82739156
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
language:
- te
size_categories:
- 10K<n<100K
---
This dataset is a translated version of three original datasets, namely [HuggingFaceH4/no_robots](https://huggingface.co/datasets/HuggingFaceH4/no_robots), [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k), and the Telugu subset of [CohereForAI/aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset). |
open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0 | ---
pretty_name: Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EmbeddedLLM/Mistral-7B-Merge-14-v0](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T09:26:04.672254](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0/blob/main/results_2023-12-18T09-26-04.672254.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495541508127465,\n\
\ \"acc_stderr\": 0.03195905121079232,\n \"acc_norm\": 0.6503791395121084,\n\
\ \"acc_norm_stderr\": 0.03260613037981816,\n \"mc1\": 0.41003671970624234,\n\
\ \"mc1_stderr\": 0.017217844717449325,\n \"mc2\": 0.5858188205527958,\n\
\ \"mc2_stderr\": 0.015472440902902757\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131169,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6736705835490938,\n\
\ \"acc_stderr\": 0.004679111783653906,\n \"acc_norm\": 0.8552081258713403,\n\
\ \"acc_norm_stderr\": 0.0035117170854519824\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.02399150050031304,\n \
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.02399150050031304\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.016242028834053616,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.016242028834053616\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045706,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045706\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41003671970624234,\n\
\ \"mc1_stderr\": 0.017217844717449325,\n \"mc2\": 0.5858188205527958,\n\
\ \"mc2_stderr\": 0.015472440902902757\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \
\ \"acc_stderr\": 0.012979892496598281\n }\n}\n```"
repo_url: https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|arc:challenge|25_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|gsm8k|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hellaswag|10_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T09-26-04.672254.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T09-26-04.672254.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- '**/details_harness|winogrande|5_2023-12-18T09-26-04.672254.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T09-26-04.672254.parquet'
- config_name: results
data_files:
- split: 2023_12_18T09_26_04.672254
path:
- results_2023-12-18T09-26-04.672254.parquet
- split: latest
path:
- results_2023-12-18T09-26-04.672254.parquet
---
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-18T09:26:04.672254](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0/blob/main/results_2023-12-18T09-26-04.672254.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495541508127465,
"acc_stderr": 0.03195905121079232,
"acc_norm": 0.6503791395121084,
"acc_norm_stderr": 0.03260613037981816,
"mc1": 0.41003671970624234,
"mc1_stderr": 0.017217844717449325,
"mc2": 0.5858188205527958,
"mc2_stderr": 0.015472440902902757
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131169,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497728
},
"harness|hellaswag|10": {
"acc": 0.6736705835490938,
"acc_stderr": 0.004679111783653906,
"acc_norm": 0.8552081258713403,
"acc_norm_stderr": 0.0035117170854519824
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.02399150050031304,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.02399150050031304
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053616,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045706,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045706
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41003671970624234,
"mc1_stderr": 0.017217844717449325,
"mc2": 0.5858188205527958,
"mc2_stderr": 0.015472440902902757
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.6671721000758151,
"acc_stderr": 0.012979892496598281
}
}
```
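Once this JSON is loaded, the per-task scores can be aggregated directly. A minimal sketch (the `results` dict below is a small hand-copied subset of the scores above, used only to illustrate the aggregation, not a full reload of the parquet files):

```python
# A hand-copied subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7171052631578947},
}

# Mean accuracy over the MMLU (hendrycksTest) tasks present in this subset.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"mean acc over {len(mmlu_tasks)} tasks: {mean_acc:.4f}")
```

The full `results_*.json` file linked above has the same key structure, so the same filtering works on the complete set of 57 MMLU subtasks.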
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Mohamad-Jaallouk/SnowTest | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: test
num_bytes: 8657666.0
num_examples: 36
download_size: 8585450
dataset_size: 8657666.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Tngarg/Codemix_tamil_english_train | ---
dataset_info:
features:
- name: tweet
dtype: string
- name: sentiment
dtype: string
splits:
- name: train
num_bytes: 1934489.5931346258
num_examples: 25840
download_size: 1135012
dataset_size: 1934489.5931346258
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Codemix_tamil_english_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_psychology | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 5820
num_examples: 5
- name: test
num_bytes: 2627471
num_examples: 545
download_size: 247537
dataset_size: 2633291
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_psychology"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nxsbr/schmidt | ---
license: openrail
---
|
SilentAntagonist/dalle3-reddit-cogvlm-captions | ---
license: cc-by-nc-4.0
---
# DALL-E 3 text-image pairs
## Description
This dataset contains DALL-E images posted on Reddit (mixed with other AI-generated images from Midjourney and Stable Diffusion). |
Nerfgun3/star_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/star_style/resolve/main/star_showcase.png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Star Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/star_style/resolve/main/star_showcase.png"/>
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder
To use it in a prompt: ```"star_style"```
Personally, I would recommend using my embeddings with a strength of 0.8, like ```"(star_style:0.8)"```
This embedding can be used for characters as well! Just use it with a strength of 0.6 or less!
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The author claims no rights to the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may redistribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/ToppyLake-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/ToppyLake-7B-slerp](https://huggingface.co/DreadPoor/ToppyLake-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T17:37:50.313114](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp/blob/main/results_2024-02-13T17-37-50.313114.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527643322107648,\n\
\ \"acc_stderr\": 0.03209219499038095,\n \"acc_norm\": 0.65299648074574,\n\
\ \"acc_norm_stderr\": 0.032754010922687565,\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6253804109646736,\n\
\ \"mc2_stderr\": 0.01533242283561269\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.6919795221843004,\n \"acc_norm_stderr\": 0.013491429517292038\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6954789882493527,\n\
\ \"acc_stderr\": 0.004592637369905785,\n \"acc_norm\": 0.8698466440948018,\n\
\ \"acc_norm_stderr\": 0.0033578442491239546\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318994,\n \
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553332,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553332\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323788,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323788\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n\
\ \"acc_stderr\": 0.016303899530796123,\n \"acc_norm\": 0.3888268156424581,\n\
\ \"acc_norm_stderr\": 0.016303899530796123\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6253804109646736,\n\
\ \"mc2_stderr\": 0.01533242283561269\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247012\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \
\ \"acc_stderr\": 0.013052097103299102\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/ToppyLake-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|arc:challenge|25_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|gsm8k|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hellaswag|10_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T17-37-50.313114.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- '**/details_harness|winogrande|5_2024-02-13T17-37-50.313114.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T17-37-50.313114.parquet'
- config_name: results
data_files:
- split: 2024_02_13T17_37_50.313114
path:
- results_2024-02-13T17-37-50.313114.parquet
- split: latest
path:
- results_2024-02-13T17-37-50.313114.parquet
---
# Dataset Card for Evaluation run of DreadPoor/ToppyLake-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/ToppyLake-7B-slerp](https://huggingface.co/DreadPoor/ToppyLake-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T17:37:50.313114](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp/blob/main/results_2024-02-13T17-37-50.313114.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527643322107648,
"acc_stderr": 0.03209219499038095,
"acc_norm": 0.65299648074574,
"acc_norm_stderr": 0.032754010922687565,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6253804109646736,
"mc2_stderr": 0.01533242283561269
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.6919795221843004,
"acc_norm_stderr": 0.013491429517292038
},
"harness|hellaswag|10": {
"acc": 0.6954789882493527,
"acc_stderr": 0.004592637369905785,
"acc_norm": 0.8698466440948018,
"acc_norm_stderr": 0.0033578442491239546
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318994,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553332,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553332
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323788,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323788
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.016303899530796123,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.016303899530796123
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659856,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6253804109646736,
"mc2_stderr": 0.01533242283561269
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247012
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299102
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
eduardo654nff/kakashinn | ---
license: openrail
---
|
projecte-aina/openslr-slr69-ca-trimmed-denoised | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- ca
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets: openslr
task_categories:
- text-to-speech
task_ids: []
pretty_name: openslr-slr69-ca-trimmed-denoised
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 811311975.4
num_examples: 4240
download_size: 721217811
dataset_size: 811311975.4
---
# Dataset Card for openslr-slr69-ca-denoised
This is a post-processed version of the Catalan subset belonging to the [Open Speech and Language Resources (OpenSLR)](https://www.openslr.org/index.html) speech dataset.
Specifically the subset [OpenSLR-69](https://www.openslr.org/69).
The original HF🤗 SLR-69 dataset is located [here](https://huggingface.co/datasets/openslr/viewer/SLR69).
The same license is maintained: [Attribution-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-sa/4.0/).
## Dataset Details
### Dataset Description
We processed the data of the Catalan OpenSLR with the following recipe:
- **Trimming:** Long silences at the start and end of clips have been removed.
  - [py-webrtcvad](https://pypi.org/project/webrtcvad/) -> a Python interface to the Voice Activity Detector (VAD) developed by Google for WebRTC.
- **Resampling:** From 48000 Hz to 22050 Hz, the most common sampling rate for training TTS models.
  - Resampler from the [CoquiTTS](https://github.com/coqui-ai/TTS/tree/dev) framework.
- **Denoising:** Although the base quality of the audio is high, we were able to remove some background noise and small artifacts thanks to the CleanUNet denoiser developed by NVIDIA.
  - [CleanUNet](https://github.com/NVIDIA/CleanUNet) - [arXiv](https://arxiv.org/abs/2202.07790)

We kept the same number of wave files, as well as the original anonymized file names and transcriptions.
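As a rough illustration of the trimming and resampling steps, the sketch below uses a simple energy threshold in place of the WebRTC VAD and `scipy.signal.resample_poly` in place of the CoquiTTS resampler; these are stand-ins for illustration, not the tools actually used in the recipe (note that 22050/48000 reduces exactly to 147/320):

```python
import numpy as np
from scipy.signal import resample_poly

def trim_silence(audio, sr, frame_ms=30, threshold=1e-4):
    """Crude energy-based trimming of leading/trailing silence.

    The real pipeline used py-webrtcvad; a frame-energy threshold is a
    simplified stand-in that captures the same idea.
    """
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    energies = np.array([
        np.mean(audio[i * frame_len:(i + 1) * frame_len] ** 2)
        for i in range(n_frames)
    ])
    voiced = np.flatnonzero(energies > threshold)
    if voiced.size == 0:
        return audio
    start = voiced[0] * frame_len
    end = (voiced[-1] + 1) * frame_len
    return audio[start:end]

def resample_48k_to_22k(audio):
    """Polyphase resampling from 48000 Hz to 22050 Hz (ratio 147/320)."""
    return resample_poly(audio, 147, 320)

# One second of silence, one second of a 440 Hz tone, one second of silence.
sr = 48000
t = np.arange(sr) / sr
clip = np.concatenate([np.zeros(sr), 0.5 * np.sin(2 * np.pi * 440 * t), np.zeros(sr)])
trimmed = trim_silence(clip, sr)
resampled = resample_48k_to_22k(trimmed)
```

The denoising step (CleanUNet) is a learned model and is not reproduced here.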
## Uses
The purpose of this dataset is mainly for training text-to-speech and automatic speech recognition models in Catalan.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
The dataset consists of a single split, providing audios and transcriptions:
```
DatasetDict({
train: Dataset({
features: ['audio', 'transcription'],
num_rows: 4240
})
})
```
Each data point is structured as:
```
>> data['train'][0]['audio']
{'path': 'caf_09901_01619988267.wav',
'array': array([-3.05175781e-05, -3.05175781e-05, -3.05175781e-05, ..., -6.10351562e-05, -6.10351562e-05, -6.10351562e-05]),
'sampling_rate': 22050}
>> data['train'][0]['transcription']
"L'òpera de Sydney es troba a l'entrada de la badia"
```
### Data Fields
- ```audio (dict)```: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column (```dataset[0]["audio"]```), the audio file is automatically decoded and resampled to ```dataset.features["audio"].sampling_rate```. Decoding and resampling a large number of audio files can take a significant amount of time, so it is important to query the sample index before the "audio" column: ```dataset[0]["audio"]``` should always be preferred over ```dataset["audio"][0]```.
* ```path (str)```: The path to the audio file.
* ```array (array)```: Decoded audio array.
* ```sampling_rate (int)```: Audio sampling rate.
- ```transcription (str)```: The sentence the user was prompted to speak.
## Dataset Creation
### Source Data
*SLR69: Crowdsourced high-quality Catalan multi-speaker speech data set*
This data set contains transcribed high-quality audio of Catalan sentences recorded by volunteers. The recordings
were prepared with the help of Direcció General de Política Lingüística del Departament de Cultura, Generalitat de
Catalunya. The data set consists of wave files, and a TSV file (line_index.tsv). The file line_index.tsv contains
an anonymized FileID and the transcription of audio in the file.
The data set has been manually quality checked, but there might still be errors.
Please report any issues in the following issue tracker on GitHub. https://github.com/googlei18n/language-resources/issues
The original dataset is distributed under Creative Commons Attribution-ShareAlike 4.0 International Public License.
See [LICENSE](https://www.openslr.org/resources/69/LICENSE) file and
[https://github.com/google/language-resources#license](https://github.com/google/language-resources#license) for license information.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
This is a post-processed version of the Catalan [OpenSLR-69](https://www.openslr.org/69) dataset.
For more information about the original data collection and processing, refer to [this paper](https://aclanthology.org/2020.sltu-1.3/).
#### Who are the source data producers?
Copyright 2018, 2019 Google, Inc.
Copyright 2023 Language Technologies Unit (LangTech) at Barcelona Supercomputing Center
### Annotations
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
(N/A)
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
This dataset is a post-processed version of a previously created dataset. Please refer to that dataset's documentation for information about any possible risks, biases, and limitations.
## Citation
The original paper where authors detail how OpenSLR-69 was generated:
```
@inproceedings{kjartansson-etal-2020-open,
title = {{Open-Source High Quality Speech Datasets for Basque, Catalan and Galician}},
author = {Kjartansson, Oddur and Gutkin, Alexander and Butryna, Alena and Demirsahin, Isin and Rivera, Clara},
booktitle = {Proceedings of the 1st Joint Workshop on Spoken Language Technologies for Under-resourced languages (SLTU) and Collaboration and Computing for Under-Resourced Languages (CCURL)},
year = {2020},
pages = {21--27},
month = may,
address = {Marseille, France},
publisher = {European Language Resources association (ELRA)},
url = {https://www.aclweb.org/anthology/2020.sltu-1.3},
ISBN = {979-10-95546-35-1},
}
```
## Funding
This work has been promoted and financed by the Generalitat de Catalunya through the [Aina project](https://projecteaina.cat/).
## Dataset Card Contact
langtech@bsc.es
|
mindchain/Text_Classification_Deutsch_Beispiel | ---
license: apache-2.0
language:
- de
size_categories:
- 1K<n<10K
--- |
ThWu/mmlu_test_val_split | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: choices
struct:
- name: gpt-4-1106-preview
dtype: string
- name: mixtral-8x7b-instruct-v0.1
dtype: string
- name: formatted_propmt
dtype: string
- name: label
dtype: string
- name: subject
dtype: string
- name: prompt_id
dtype: int64
- name: winner
dtype: string
splits:
- name: val
num_bytes: 5321330
num_examples: 1531
- name: test
num_bytes: 48566928
num_examples: 14042
download_size: 11932844
dataset_size: 53888258
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
Vextwix/Yes | ---
license: creativeml-openrail-m
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T03:32:51.454817](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o/blob/main/results_2023-10-28T03-32-51.454817.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.37258808724832215,\n\
\ \"em_stderr\": 0.004951428522573584,\n \"f1\": 0.41863255033557134,\n\
\ \"f1_stderr\": 0.004838761301543826,\n \"acc\": 0.4445987937813739,\n\
\ \"acc_stderr\": 0.010466651540029098\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.37258808724832215,\n \"em_stderr\": 0.004951428522573584,\n\
\ \"f1\": 0.41863255033557134,\n \"f1_stderr\": 0.004838761301543826\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \
\ \"acc_stderr\": 0.009041108602874675\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T03_32_51.454817
path:
- '**/details_harness|drop|3_2023-10-28T03-32-51.454817.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T03-32-51.454817.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T03_32_51.454817
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-32-51.454817.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-32-51.454817.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-48-52.263585.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T09-48-52.263585.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T03_32_51.454817
path:
- '**/details_harness|winogrande|5_2023-10-28T03-32-51.454817.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T03-32-51.454817.parquet'
- config_name: results
data_files:
- split: 2023_10_10T09_48_52.263585
path:
- results_2023-10-10T09-48-52.263585.parquet
- split: 2023_10_28T03_32_51.454817
path:
- results_2023-10-28T03-32-51.454817.parquet
- split: latest
path:
- results_2023-10-28T03-32-51.454817.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o",
"harness_winogrande_5",
split="train")
```
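Note that run-specific split names follow a simple convention visible in the configuration above: the run timestamp with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). A minimal helper, assuming this naming convention holds across configs:

```python
def run_split(timestamp: str) -> str:
    """Map an ISO run timestamp to the split name used in this dataset.

    e.g. "2023-10-28T03:32:51.454817" -> "2023_10_28T03_32_51.454817"
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split("2023-10-28T03:32:51.454817"))  # → 2023_10_28T03_32_51.454817
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run instead of `"latest"`.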
## Latest results
These are the [latest results from run 2023-10-28T03:32:51.454817](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o/blob/main/results_2023-10-28T03-32-51.454817.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of the corresponding eval):
```python
{
"all": {
"em": 0.37258808724832215,
"em_stderr": 0.004951428522573584,
"f1": 0.41863255033557134,
"f1_stderr": 0.004838761301543826,
"acc": 0.4445987937813739,
"acc_stderr": 0.010466651540029098
},
"harness|drop|3": {
"em": 0.37258808724832215,
"em_stderr": 0.004951428522573584,
"f1": 0.41863255033557134,
"f1_stderr": 0.004838761301543826
},
"harness|gsm8k|5": {
"acc": 0.12282031842304776,
"acc_stderr": 0.009041108602874675
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
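The `"all"` block above is the aggregate of the per-task metrics; for instance, its `acc` is the plain mean of the gsm8k and winogrande accuracies. A quick sanity check against the reported numbers (assuming simple unweighted averaging):

```python
# Per-task accuracies copied from the latest-results JSON above.
per_task = {
    "harness|gsm8k|5": {"acc": 0.12282031842304776},
    "harness|winogrande|5": {"acc": 0.7663772691397001},
}

# Unweighted mean over tasks reproduces the aggregated "all" accuracy.
mean_acc = sum(t["acc"] for t in per_task.values()) / len(per_task)
print(mean_acc)  # → 0.4445987937813739, matching "all" -> "acc"
```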
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SageLiao/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 0
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
version-control/ds-lib-version-3 | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: version
list:
- name: pyproject.toml
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: requirements.txt
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: setup.py
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: hexsha
sequence: string
splits:
- name: train
num_bytes: 2506652
num_examples: 10000
download_size: 824580
dataset_size: 2506652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MatsuoDochiai/Roxy | ---
license: openrail
---
|
Amir13/wnut2017-persian | ---
annotations_creators:
- crowdsourced
language:
- fa
language_creators:
- machine-generated
license:
- other
multilinguality:
- monolingual
paperswithcode_id:
pretty_name: wnut2017-persian
size_categories:
- 1K<n<10K
tags:
- named entity recognition
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
If you used the datasets and models in this repository, please cite it.
```bibtex
@misc{https://doi.org/10.48550/arxiv.2302.09611,
doi = {10.48550/ARXIV.2302.09611},
url = {https://arxiv.org/abs/2302.09611},
author = {Sartipi, Amir and Fatemi, Afsaneh},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Exploring the Potential of Machine Translation for Generating Named Entity Datasets: A Case Study between Persian and English},
publisher = {arXiv},
year = {2023},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
[More Information Needed]
|
sachith-surge/LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: source
dtype: string
- name: response
dtype: string
- name: llama2_status
dtype: string
- name: llama2_rating
dtype: string
- name: llama2_reason
dtype: string
- name: gpt4_status
dtype: string
- name: gpt4_rating
dtype: string
- name: gpt4_reason
dtype: string
splits:
- name: train
num_bytes: 2729018
num_examples: 1505
download_size: 1378351
dataset_size: 2729018
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_voidful__changpt-bart | ---
pretty_name: Evaluation run of voidful/changpt-bart
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [voidful/changpt-bart](https://huggingface.co/voidful/changpt-bart) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_voidful__changpt-bart\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T15:06:06.678906](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__changpt-bart/blob/main/results_2023-12-02T15-06-06.678906.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/voidful/changpt-bart
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T22_50_05.507806
path:
- '**/details_harness|drop|3_2023-09-17T22-50-05.507806.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T22-50-05.507806.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T22_50_05.507806
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-50-05.507806.parquet'
- split: 2023_12_02T15_06_06.678906
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-06-06.678906.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-06-06.678906.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:52:50.972620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:52:50.972620.parquet'
- split: 2023_08_09T19_53_13.918423
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:53:13.918423.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T19:53:13.918423.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T22_50_05.507806
path:
- '**/details_harness|winogrande|5_2023-09-17T22-50-05.507806.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T22-50-05.507806.parquet'
- config_name: results
data_files:
- split: 2023_08_09T19_52_50.972620
path:
- results_2023-08-09T19:52:50.972620.parquet
- split: 2023_08_09T19_53_13.918423
path:
- results_2023-08-09T19:53:13.918423.parquet
- split: 2023_09_17T22_50_05.507806
path:
- results_2023-09-17T22-50-05.507806.parquet
- split: 2023_12_02T15_06_06.678906
path:
- results_2023-12-02T15-06-06.678906.parquet
- split: latest
path:
- results_2023-12-02T15-06-06.678906.parquet
---
# Dataset Card for Evaluation run of voidful/changpt-bart
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/voidful/changpt-bart
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [voidful/changpt-bart](https://huggingface.co/voidful/changpt-bart) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
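As a rough illustration (not part of the card's tooling, and the helper name below is hypothetical), the split names used in this card's `data_files` can be derived from the run timestamps by replacing the separator characters:

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp (e.g. '2023-08-09T19:53:13.918423') to the
    split-name convention seen in this card's data_files
    (e.g. '2023_08_09T19_53_13.918423')."""
    # Dashes in the date and colons in the time both become underscores;
    # the fractional-seconds dot is kept as-is.
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-08-09T19:53:13.918423"))
# → 2023_08_09T19_53_13.918423
```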
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_voidful__changpt-bart",
"harness_gsm8k_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-02T15:06:06.678906](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__changpt-bart/blob/main/results_2023-12-02T15-06-06.678906.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
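Since the aggregated metrics above are plain JSON, they can also be inspected without the `datasets` library; below is a minimal stdlib sketch that inlines the snippet above as a string for illustration (rather than downloading the results file):

```python
import json

# The "Latest results" block above, inlined for illustration.
results_json = """
{
    "all": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
"""

results = json.loads(results_json)
for task, metrics in results.items():
    # Each task maps to a dict of metric name -> value.
    print(f"{task}: acc={metrics['acc']} (+/- {metrics['acc_stderr']})")
# → all: acc=0.0 (+/- 0.0)
# → harness|gsm8k|5: acc=0.0 (+/- 0.0)
```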
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
polinaeterna/new_meta_format | ---
dataset_info:
- config_name: custom
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 160
num_examples: 10
download_size: 3650
dataset_size: 1760
- config_name: default
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 800
num_examples: 50
download_size: 0
dataset_size: 2400
builder_config:
- config_name: custom
data_files:
- split: train
pattern: custom/train-*
- split: random
pattern: custom/random-*
- config_name: default
data_files:
- split: train
pattern: data/train-*
- split: random
pattern: data/random-*
---
# Dataset Card for "new_meta_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Marchanjo/spider-es | ---
license: cc-by-sa-4.0
---
Distributed under Creative Commons CC BY-SA 4.0, respecting the ShareAlike condition of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links to the model checkpoints and datasets are on GitHub: [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model checkpoints and datasets; for explanations, it is better to go to the GitHub repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of tables and columns names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy results from 0.718 to 0.736 in a validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Paper published in [Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); here is the [SharedIt link](https://rdcu.be/dff19) and here the [pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP:A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) |
arieg/bw_spec_cls_4_15_s_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1278'
'1': '1417'
'2': '1427'
'3': '1443'
splits:
- name: train
num_bytes: 42892698.0
num_examples: 800
- name: test
num_bytes: 1071895.0
num_examples: 20
download_size: 38128383
dataset_size: 43964593.0
---
# Dataset Card for "bw_spec_cls_4_15_s_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shivanshjayara2991/ner_resume_data | ---
license: other
---
|
rdiazconcha/marketing-synthetic | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 20249
num_examples: 10
download_size: 27613
dataset_size: 20249
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "marketing-synthetic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/autotrain-data-french-ner-blank-model | Invalid username or password. |
a98zhang/ibm_argument_example | ---
pretty_name: example_ibm
--- |
ChristophSchuhmann/Chess-Selfplay | ---
license: apache-2.0
---
|
andersonbcdefg/MEDI-NQ-subset_with_margins | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: source
dtype: string
- name: qp_sim
dtype: float32
- name: qn_sim
dtype: float32
- name: pn_sim
dtype: float32
- name: margin
dtype: float64
splits:
- name: train
num_bytes: 32571587.635637943
num_examples: 35797
download_size: 30942084
dataset_size: 32571587.635637943
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
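Given the per-triplet similarity columns above (`qp_sim`, `qn_sim`, `pn_sim`, `margin`), a margin column like this is typically used to filter triplets by how separable the positive and negative are. A minimal sketch follows; the assumption that `margin` is the gap `qp_sim - qn_sim`, the `triplet_margin` helper, and the toy rows are all illustrative, not taken from this dataset's documentation.

```python
def triplet_margin(qp_sim: float, qn_sim: float) -> float:
    """Gap between query-positive and query-negative similarity.

    Assumption: larger values indicate an easier (more separable)
    triplet; values near zero indicate a hard negative.
    """
    return qp_sim - qn_sim

# toy rows mimicking the schema above
rows = [
    {"qp_sim": 0.82, "qn_sim": 0.31},
    {"qp_sim": 0.64, "qn_sim": 0.60},
]
# keep only well-separated triplets, e.g. before contrastive training
easy = [r for r in rows if triplet_margin(r["qp_sim"], r["qn_sim"]) > 0.1]
print(len(easy))
```

The same filter can be applied directly to the `margin` column after loading the dataset, without recomputing it.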
|
cyrilzhang/TinyStories2-ascii-bpe-1600 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2453198100
num_examples: 598341
- name: validation
num_bytes: 24690200
num_examples: 6022
download_size: 856137162
dataset_size: 2477888300
---
# Dataset Card for "TinyStories2-ascii-bpe-1600"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChristophSchuhmann/Mega-Fast-KNN-Captioning | ---
license: apache-2.0
---
|
yjernite/prof_images_blip__Lykon-DreamShaper | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: bartender
num_bytes: 4232353.0
num_examples: 100
- name: facilities_manager
num_bytes: 3233702.0
num_examples: 100
- name: accountant
num_bytes: 3301253.0
num_examples: 100
- name: graphic_designer
num_bytes: 3779936.0
num_examples: 100
- name: financial_manager
num_bytes: 3032824.0
num_examples: 100
- name: baker
num_bytes: 3760855.0
num_examples: 100
- name: artist
num_bytes: 3321552.0
num_examples: 100
- name: author
num_bytes: 3841657.0
num_examples: 100
- name: clergy
num_bytes: 3326689.0
num_examples: 100
- name: customer_service_representative
num_bytes: 3353667.0
num_examples: 100
- name: dental_hygienist
num_bytes: 3116590.0
num_examples: 100
- name: electrician
num_bytes: 4444433.0
num_examples: 100
- name: head_cook
num_bytes: 3711054.0
num_examples: 100
- name: health_technician
num_bytes: 3208097.0
num_examples: 100
- name: carpet_installer
num_bytes: 4231786.0
num_examples: 100
- name: civil_engineer
num_bytes: 3887933.0
num_examples: 100
- name: ceo
num_bytes: 2725789.0
num_examples: 100
- name: computer_support_specialist
num_bytes: 3768802.0
num_examples: 100
- name: dentist
num_bytes: 3051311.0
num_examples: 100
- name: butcher
num_bytes: 4473092.0
num_examples: 100
- name: courier
num_bytes: 3220269.0
num_examples: 100
- name: computer_programmer
num_bytes: 4013303.0
num_examples: 100
- name: correctional_officer
num_bytes: 3250295.0
num_examples: 100
- name: executive_assistant
num_bytes: 3109178.0
num_examples: 100
- name: designer
num_bytes: 3360493.0
num_examples: 100
- name: groundskeeper
num_bytes: 3526805.0
num_examples: 100
- name: aerospace_engineer
num_bytes: 4889373.0
num_examples: 100
- name: data_entry_keyer
num_bytes: 3810901.0
num_examples: 100
- name: event_planner
num_bytes: 3416510.0
num_examples: 100
- name: cook
num_bytes: 3783118.0
num_examples: 100
- name: hairdresser
num_bytes: 3197788.0
num_examples: 100
- name: farmer
num_bytes: 4224326.0
num_examples: 100
- name: construction_worker
num_bytes: 3595787.0
num_examples: 100
- name: air_conditioning_installer
num_bytes: 4078377.0
num_examples: 100
- name: electrical_engineer
num_bytes: 5068341.0
num_examples: 100
- name: career_counselor
num_bytes: 3402257.0
num_examples: 100
- name: clerk
num_bytes: 3603897.0
num_examples: 100
- name: director
num_bytes: 3015590.0
num_examples: 100
- name: fast_food_worker
num_bytes: 3902204.0
num_examples: 100
- name: cleaner
num_bytes: 2822728.0
num_examples: 100
- name: computer_systems_analyst
num_bytes: 4211576.0
num_examples: 100
- name: dental_assistant
num_bytes: 3135047.0
num_examples: 100
- name: architect
num_bytes: 3334524.0
num_examples: 100
- name: drywall_installer
num_bytes: 3186332.0
num_examples: 100
- name: childcare_worker
num_bytes: 3723729.0
num_examples: 100
- name: file_clerk
num_bytes: 4124578.0
num_examples: 100
- name: community_manager
num_bytes: 2923881.0
num_examples: 100
- name: carpenter
num_bytes: 4186317.0
num_examples: 100
- name: claims_appraiser
num_bytes: 3668012.0
num_examples: 100
- name: dispatcher
num_bytes: 4311103.0
num_examples: 100
- name: cashier
num_bytes: 4015653.0
num_examples: 100
- name: detective
num_bytes: 2545399.0
num_examples: 100
- name: financial_advisor
num_bytes: 3101141.0
num_examples: 100
- name: engineer
num_bytes: 4143278.0
num_examples: 100
- name: dishwasher
num_bytes: 4891231.0
num_examples: 100
- name: fitness_instructor
num_bytes: 3356902.0
num_examples: 100
- name: credit_counselor
num_bytes: 3340328.0
num_examples: 100
- name: doctor
num_bytes: 3038762.0
num_examples: 100
- name: compliance_officer
num_bytes: 3241075.0
num_examples: 100
- name: aide
num_bytes: 3472385.0
num_examples: 100
- name: bus_driver
num_bytes: 4379280.0
num_examples: 100
- name: financial_analyst
num_bytes: 3730273.0
num_examples: 100
- name: firefighter
num_bytes: 4226861.0
num_examples: 100
- name: coach
num_bytes: 3364291.0
num_examples: 100
download_size: 243149155
dataset_size: 232746873.0
---
# Dataset Card for "prof_images_blip__Lykon-DreamShaper"
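Each split above pairs an `image` with a `float32` `embeddings` sequence, so a common use is comparing professions by embedding similarity. A minimal pure-Python sketch of cosine similarity follows; the 4-dimensional vectors are made up for illustration (real embeddings are much longer).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# toy 4-dimensional embeddings, purely illustrative
emb_doctor = [0.1, 0.9, 0.2, 0.0]
emb_dentist = [0.2, 0.8, 0.1, 0.1]
print(round(cosine_similarity(emb_doctor, emb_dentist), 3))
```

In practice one would average each split's `embeddings` column first and compare those per-profession centroids.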
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/chung-khoan-demo-13 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 124799388
num_examples: 25736
download_size: 44124158
dataset_size: 124799388
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amazingvince/sft-max-2.1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 9385512725
num_examples: 2363460
- name: test
num_bytes: 9211485
num_examples: 2366
download_size: 4947814472
dataset_size: 9394724210
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/warspite_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of warspite/ウォースパイト/厌战 (Azur Lane)
This is the dataset of warspite/ウォースパイト/厌战 (Azur Lane), containing 222 images and their tags.
The core tags of this character are `blonde_hair, long_hair, purple_eyes, bangs, hair_between_eyes, headgear, sidelocks, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 222 | 273.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/warspite_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 222 | 159.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/warspite_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 531 | 339.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/warspite_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 222 | 243.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/warspite_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 531 | 476.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/warspite_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/warspite_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of the tag clustering; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 44 |  |  |  |  |  | 1girl, solo, white_gloves, long_sleeves, epaulettes, side-tie_panties, black_panties, looking_at_viewer, white_scarf, jacket, holding_sword, simple_background, blush, open_mouth, white_background, smile |
| 1 | 10 |  |  |  |  |  | 1girl, christmas, looking_at_viewer, santa_costume, santa_hat, solo, white_scarf, earmuffs, white_pantyhose, hair_ears, holding, red_headwear, :d, blush, boots, fur_trim, open_mouth, red_dress, red_footwear, simple_background, white_background, animal_ears, bare_shoulders, bell, closed_mouth, detached_sleeves, gift, mittens, red_gloves, sack |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_gloves | long_sleeves | epaulettes | side-tie_panties | black_panties | looking_at_viewer | white_scarf | jacket | holding_sword | simple_background | blush | open_mouth | white_background | smile | christmas | santa_costume | santa_hat | earmuffs | white_pantyhose | hair_ears | holding | red_headwear | :d | boots | fur_trim | red_dress | red_footwear | animal_ears | bare_shoulders | bell | closed_mouth | detached_sleeves | gift | mittens | red_gloves | sack |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:---------------|:-------------|:-------------------|:----------------|:--------------------|:--------------|:---------|:----------------|:--------------------|:--------|:-------------|:-------------------|:--------|:------------|:----------------|:------------|:-----------|:------------------|:------------|:----------|:---------------|:-----|:--------|:-----------|:------------|:---------------|:--------------|:-----------------|:-------|:---------------|:-------------------|:-------|:----------|:-------------|:-------|
| 0 | 44 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | | | | | X | X | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
spycoder/rawruwurawr | ---
license: mit
---
|
CyberHarem/gridley_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gridley/グリッドレイ/格里德利 (Azur Lane)
This is the dataset of gridley/グリッドレイ/格里德利 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, bangs, hair_between_eyes, ahoge, long_hair, bow, hair_ornament, two_side_up, drill_hair, red_bow, animal_ears, deer_ears, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 16.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 9.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 20.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 14.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 29.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/gridley_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of the tag clustering; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | blush, 1girl, bare_shoulders, looking_at_viewer, smile, solo, holding, open_mouth, sleeveless, thighhighs, camera, christmas, red_dress, reindeer_antlers, santa_costume, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | bare_shoulders | looking_at_viewer | smile | solo | holding | open_mouth | sleeveless | thighhighs | camera | christmas | red_dress | reindeer_antlers | santa_costume | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:--------------------|:--------|:-------|:----------|:-------------|:-------------|:-------------|:---------|:------------|:------------|:-------------------|:----------------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2 | ---
pretty_name: Evaluation run of CalderaAI/13B-Legerdemain-L2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CalderaAI/13B-Legerdemain-L2](https://huggingface.co/CalderaAI/13B-Legerdemain-L2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-12T20:33:10.328879](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2/blob/main/results_2023-10-12T20-33-10.328879.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n\
\ \"em_stderr\": 0.0005340111700415904,\n \"f1\": 0.06216547818791966,\n\
\ \"f1_stderr\": 0.0013785278979549318,\n \"acc\": 0.4412861505062612,\n\
\ \"acc_stderr\": 0.010705008172209724\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415904,\n\
\ \"f1\": 0.06216547818791966,\n \"f1_stderr\": 0.0013785278979549318\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13040181956027294,\n \
\ \"acc_stderr\": 0.0092756303245541\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CalderaAI/13B-Legerdemain-L2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T20_33_10.328879
path:
- '**/details_harness|drop|3_2023-10-12T20-33-10.328879.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-12T20-33-10.328879.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T20_33_10.328879
path:
- '**/details_harness|gsm8k|5_2023-10-12T20-33-10.328879.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-12T20-33-10.328879.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:34:37.986977.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:34:37.986977.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T20_33_10.328879
path:
- '**/details_harness|winogrande|5_2023-10-12T20-33-10.328879.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-12T20-33-10.328879.parquet'
- config_name: results
data_files:
- split: 2023_08_09T11_34_37.986977
path:
- results_2023-08-09T11:34:37.986977.parquet
- split: 2023_10_12T20_33_10.328879
path:
- results_2023-10-12T20-33-10.328879.parquet
- split: latest
path:
- results_2023-10-12T20-33-10.328879.parquet
---
# Dataset Card for Evaluation run of CalderaAI/13B-Legerdemain-L2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CalderaAI/13B-Legerdemain-L2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CalderaAI/13B-Legerdemain-L2](https://huggingface.co/CalderaAI/13B-Legerdemain-L2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2",
"harness_winogrande_5",
split="train")
```
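The timestamped split names follow a simple convention. As a sketch (an assumption inferred from the names in this card, e.g. run `2023-10-12T20:33:10.328879` appearing as split `2023_10_12T20_33_10.328879`, not an official API), the mapping replaces characters that are awkward in split names:

```python
# Sketch of the split-naming convention used by these evaluation cards.
# Assumption: inferred from the split/file names shown in this card
# (run 2023-10-12T20:33:10.328879 -> split 2023_10_12T20_33_10.328879).

def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to the corresponding split name."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-10-12T20:33:10.328879"))
# 2023_10_12T20_33_10.328879
```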
## Latest results
These are the [latest results from run 2023-10-12T20:33:10.328879](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2/blob/main/results_2023-10-12T20-33-10.328879.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415904,
"f1": 0.06216547818791966,
"f1_stderr": 0.0013785278979549318,
"acc": 0.4412861505062612,
"acc_stderr": 0.010705008172209724
},
"harness|drop|3": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415904,
"f1": 0.06216547818791966,
"f1_stderr": 0.0013785278979549318
},
"harness|gsm8k|5": {
"acc": 0.13040181956027294,
"acc_stderr": 0.0092756303245541
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.01213438601986535
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zolak/twitter_dataset_78_1713181846 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 280052
num_examples: 702
download_size: 147591
dataset_size: 280052
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MagedSaeed/AskTheTeacherDataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: index
dtype: string
- name: instruction_en
dtype: string
- name: output_en
dtype: string
splits:
- name: train
num_bytes: 649329
num_examples: 918
download_size: 320236
dataset_size: 649329
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FudanSELab/CodeGen4Libs | ---
license: mit
tags:
- code-generation
pretty_name: CodeGen4Libs Dataset
size_categories:
- 100K<n<1M
---
# Dataset Card for FudanSELab CodeGen4Libs Dataset
## Dataset Description
- **Repository:** [GitHub Repository](https://github.com/FudanSELab/codegen4libs)
- **Paper:** [CodeGen4Libs: A Two-stage Approach for Library-oriented Code Generation](https://mingwei-liu.github.io/publication/2023-08-18-ase-CodeGen4Libs)
### Dataset Summary
This dataset is used in the ASE2023 paper titled ["CodeGen4Libs: A Two-stage Approach for Library-oriented Code Generation"](https://mingwei-liu.github.io/publication/2023-08-18-ase-CodeGen4Libs).
### Languages
[More Information Needed]
## Dataset Structure
```python
from datasets import load_dataset
dataset = load_dataset("FudanSELab/CodeGen4Libs")
DatasetDict({
train: Dataset({
features: ['id', 'method', 'clean_method', 'doc', 'comment', 'method_name', 'extra', 'imports_info', 'libraries_info', 'input_str', 'input_ids', 'tokenized_input_str', 'input_token_length', 'labels', 'tokenized_labels_str', 'labels_token_length', 'retrieved_imports_info', 'retrieved_code', 'imports', 'cluster_imports_info', 'libraries', 'attention_mask'],
num_rows: 391811
})
validation: Dataset({
features: ['id', 'method', 'clean_method', 'doc', 'comment', 'method_name', 'extra', 'imports_info', 'libraries_info', 'input_str', 'input_ids', 'tokenized_input_str', 'input_token_length', 'labels', 'tokenized_labels_str', 'labels_token_length', 'retrieved_imports_info', 'retrieved_code', 'imports', 'cluster_imports_info', 'libraries', 'attention_mask'],
num_rows: 5967
})
test: Dataset({
features: ['id', 'method', 'clean_method', 'doc', 'comment', 'method_name', 'extra', 'imports_info', 'libraries_info', 'input_str', 'input_ids', 'tokenized_input_str', 'input_token_length', 'labels', 'tokenized_labels_str', 'labels_token_length', 'retrieved_imports_info', 'retrieved_code', 'imports', 'cluster_imports_info', 'libraries', 'attention_mask'],
num_rows: 6002
})
})
```
### Data Fields
The specific data fields for each tuple are delineated as follows:
- id: the unique identifier for each tuple.
- method: the original method-level code for each tuple.
- clean_method: the ground-truth method-level code for each task.
- doc: the document of method-level code for each tuple.
- comment: the natural language description for each tuple.
- method_name: the name of the method.
- extra: extra information on the code repository to which the method-level code belongs.
  - license: the license of the code repository.
  - path: the path of the code repository.
  - repo_name: the name of the code repository.
  - size: the size of the code repository.
- imports_info: the import statements for each tuple.
- libraries_info: the libraries info for each tuple.
- input_str: the design of model input.
- input_ids: the ids of tokenized input.
- tokenized_input_str: the tokenized input.
- input_token_length: the length of the tokenized input.
- labels: the ids of tokenized output.
- tokenized_labels_str: the tokenized output.
- labels_token_length: the length of the tokenized output.
- retrieved_imports_info: the retrieved import statements for each tuple.
- retrieved_code: the retrieved method-level code for each tuple.
- imports: the imported packages of each import statement.
- cluster_imports_info: cluster import information of code.
- libraries: libraries used by the code.
- attention_mask: attention mask for the input.
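To make the field list above concrete, here is a hedged sketch of a few fields of a single record. Only the field names come from this card; every value shown, the subset of fields, and the nesting of `extra` are illustrative assumptions:

```python
# Hypothetical shape of (part of) a single CodeGen4Libs record. Field names
# follow the card's "Data Fields" list; all values and the nesting of
# `extra` are illustrative assumptions, not taken from the actual dataset.
example = {
    "id": "0",
    "method": 'public String hello() { return "Hello"; }',
    "clean_method": 'public String hello() { return "Hello"; }',
    "doc": "Returns a greeting.",
    "comment": "return a greeting string",
    "method_name": "hello",
    "extra": {  # repository-level metadata
        "license": "MIT",
        "path": "src/main/java/Hello.java",
        "repo_name": "example/repo",
        "size": 1024,
    },
    "imports_info": "import java.util.*;",
    "libraries": ["java.util"],
}

# Accessing the nested repository metadata:
print(example["extra"]["repo_name"])  # example/repo
```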
### Data Splits
The dataset is split into a training set, a validation set, and a test set, with 391811, 5967, and 6002 data rows respectively.
## Additional Information
### Citation Information
```
@inproceedings{ase2023codegen4libs,
  author = {Mingwei Liu and Tianyong Yang and Yiling Lou and Xueying Du and Ying Wang and Xin Peng},
title = {{CodeGen4Libs}: A Two-stage Approach for Library-oriented Code Generation},
booktitle = {38th {IEEE/ACM} International Conference on Automated Software Engineering,
{ASE} 2023, Kirchberg, Luxembourg, September 11-15, 2023},
pages = {0--0},
publisher = {{IEEE}},
year = {2023},
}
``` |
amishshah/imbalanced_4 | ---
dataset_info:
features:
- name: title
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 45166669.74
num_examples: 27000
- name: test
num_bytes: 5018518.86
num_examples: 3000
download_size: 27086501
dataset_size: 50185188.6
---
# Dataset Card for "imbalanced_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reyrg/thermal-camera_v2 | ---
license: cc-by-4.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 766087220.0
num_examples: 546
download_size: 49415770
dataset_size: 766087220.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ortofasfat/AAAI_Swahili_dataset | ---
license: cc
task_categories:
- question-answering
- text-generation
language:
- sw
- en
pretty_name: Dolly_Swahili
size_categories:
- 10K<n<100K
---
# README for Swahili Translated Dataset from Toloka
## Dataset Description
This dataset is the Dolly 15k dataset translated from English to Swahili, filtered and processed using the Toloka platform. It includes various contexts, responses, and instructions from diverse domains, providing a rich resource for natural language processing tasks, particularly those focused on the Swahili language.
### Data Fields
- **task_id**: A unique identifier for each task in the dataset.
- **INPUT:context_tr**: The context of the task translated into Swahili.
- **INPUT:context_src**: The original context of the task in English.
- **INPUT:response_tr**: The response or answer to the task, translated into Swahili.
- **INPUT:response_src**: The original response or answer to the task in English.
- **INPUT:instruction_tr**: The instruction for the task, translated into Swahili.
- **INPUT:instruction_src**: The original instruction for the task in English.
- **toloka probabilities**: Probabilities indicating the confidence or quality score associated with each task, as evaluated on the Toloka platform.
### Data Format
The dataset is provided in CSV format.
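Since the card documents both the column names and the CSV format, rows can be filtered by the Toloka quality score with the standard library. This is a minimal sketch: the column names follow the "Data Fields" section above, while the inline sample data, the 0.8 threshold, and the exact column spelling `toloka probabilities` are illustrative assumptions.

```python
import csv
import io

# Minimal sketch: keep only translated rows above a quality threshold.
# Column names follow the card's "Data Fields" section; the sample rows,
# the threshold, and the column spelling are illustrative assumptions.
sample_csv = io.StringIO(
    "task_id,INPUT:instruction_src,INPUT:instruction_tr,toloka probabilities\n"
    "1,Translate this,Tafsiri hii,0.95\n"
    "2,Summarize the text,Fupisha maandishi,0.55\n"
)

reader = csv.DictReader(sample_csv)
high_quality = [
    row for row in reader
    if float(row["toloka probabilities"]) >= 0.8
]

for row in high_quality:
    print(row["task_id"], row["INPUT:instruction_tr"])
# 1 Tafsiri hii
```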
## Dataset Creation
### Curation Rationale
This dataset was curated to enhance the resources available for Swahili natural language processing. It aims to provide a robust set of data for translation tasks, question-answering systems, and other language modeling tasks in Swahili.
### Source Data
#### Initial Data Collection and Normalization
The initial data was collected from the Dolly-15K dataset.
### Dataset Curators
This dataset was curated by Nazar Beknazarov, leveraging the Toloka platform for translation and filtering.
### Citation Information
Please include citation information here, if available. |
anugrah1112/test_gpt | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 32008.0
num_examples: 2
- name: test
num_bytes: 16004
num_examples: 1
download_size: 30762
dataset_size: 48012.0
---
# Dataset Card for "test_gpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_vicgalleorg__TruthfulQwen1.5-4B | ---
pretty_name: Evaluation run of vicgalleorg/TruthfulQwen1.5-4B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vicgalleorg/TruthfulQwen1.5-4B](https://huggingface.co/vicgalleorg/TruthfulQwen1.5-4B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalleorg__TruthfulQwen1.5-4B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T19:39:20.098262](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalleorg__TruthfulQwen1.5-4B/blob/main/results_2024-03-04T19-39-20.098262.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5591026780671433,\n\
\ \"acc_stderr\": 0.034055265814276585,\n \"acc_norm\": 0.5614839252312046,\n\
\ \"acc_norm_stderr\": 0.03474577841927621,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907822,\n \"mc2\": 0.5060376733678527,\n\
\ \"mc2_stderr\": 0.014856853869795477\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.439419795221843,\n \"acc_stderr\": 0.014503747823580125,\n\
\ \"acc_norm\": 0.4709897610921502,\n \"acc_norm_stderr\": 0.014586776355294316\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5285799641505676,\n\
\ \"acc_stderr\": 0.004981623292196192,\n \"acc_norm\": 0.7132045409281019,\n\
\ \"acc_norm_stderr\": 0.004513409114983842\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154954,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154954\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909916,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909916\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083025,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083025\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.019342036587702588,\n \"\
acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.019342036587702588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209794,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209794\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7394636015325671,\n\
\ \"acc_stderr\": 0.01569600856380708,\n \"acc_norm\": 0.7394636015325671,\n\
\ \"acc_norm_stderr\": 0.01569600856380708\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400172,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110307,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5864197530864198,\n \"acc_stderr\": 0.02740204204026996,\n\
\ \"acc_norm\": 0.5864197530864198,\n \"acc_norm_stderr\": 0.02740204204026996\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n\
\ \"acc_stderr\": 0.012579699631289262,\n \"acc_norm\": 0.41395045632333766,\n\
\ \"acc_norm_stderr\": 0.012579699631289262\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.0201429745537952,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.0201429745537952\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505418,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505418\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935556,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935556\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907822,\n \"mc2\": 0.5060376733678527,\n\
\ \"mc2_stderr\": 0.014856853869795477\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5253980288097043,\n \
\ \"acc_stderr\": 0.013754705089112309\n }\n}\n```"
repo_url: https://huggingface.co/vicgalleorg/TruthfulQwen1.5-4B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|arc:challenge|25_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|gsm8k|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hellaswag|10_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-39-20.098262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T19-39-20.098262.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- '**/details_harness|winogrande|5_2024-03-04T19-39-20.098262.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T19-39-20.098262.parquet'
- config_name: results
data_files:
- split: 2024_03_04T19_39_20.098262
path:
- results_2024-03-04T19-39-20.098262.parquet
- split: latest
path:
- results_2024-03-04T19-39-20.098262.parquet
---
# Dataset Card for Evaluation run of vicgalleorg/TruthfulQwen1.5-4B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalleorg/TruthfulQwen1.5-4B](https://huggingface.co/vicgalleorg/TruthfulQwen1.5-4B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalleorg__TruthfulQwen1.5-4B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-04T19:39:20.098262](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalleorg__TruthfulQwen1.5-4B/blob/main/results_2024-03-04T19-39-20.098262.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5591026780671433,
"acc_stderr": 0.034055265814276585,
"acc_norm": 0.5614839252312046,
"acc_norm_stderr": 0.03474577841927621,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907822,
"mc2": 0.5060376733678527,
"mc2_stderr": 0.014856853869795477
},
"harness|arc:challenge|25": {
"acc": 0.439419795221843,
"acc_stderr": 0.014503747823580125,
"acc_norm": 0.4709897610921502,
"acc_norm_stderr": 0.014586776355294316
},
"harness|hellaswag|10": {
"acc": 0.5285799641505676,
"acc_stderr": 0.004981623292196192,
"acc_norm": 0.7132045409281019,
"acc_norm_stderr": 0.004513409114983842
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154954,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154954
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909916,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909916
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083025,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083025
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.019342036587702588,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.019342036587702588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209794,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209794
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7394636015325671,
"acc_stderr": 0.01569600856380708,
"acc_norm": 0.7394636015325671,
"acc_norm_stderr": 0.01569600856380708
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400172,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110307,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5864197530864198,
"acc_stderr": 0.02740204204026996,
"acc_norm": 0.5864197530864198,
"acc_norm_stderr": 0.02740204204026996
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289262,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289262
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.0201429745537952,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.0201429745537952
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505418,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505418
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935556,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935556
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907822,
"mc2": 0.5060376733678527,
"mc2_stderr": 0.014856853869795477
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964657
},
"harness|gsm8k|5": {
"acc": 0.5253980288097043,
"acc_stderr": 0.013754705089112309
}
}
```
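A rough sketch of how per-task records like these can be aggregated (the field names match the JSON above, but the subset of tasks is hand-picked for illustration and this is not the leaderboard's own aggregation code):

```python
# Aggregate per-task accuracies from records shaped like the JSON above.
# The dict below is a hand-copied subset of the "Latest results" section.

results = {
    "harness|arc:challenge|25": {"acc": 0.439419795221843},
    "harness|hellaswag|10": {"acc": 0.5285799641505676},
    "harness|winogrande|5": {"acc": 0.6685082872928176},
    "harness|gsm8k|5": {"acc": 0.5253980288097043},
}

# Unweighted mean accuracy over the selected tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # 0.5405
```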
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_sst2_completive_finish | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3567
num_examples: 25
- name: test
num_bytes: 9321
num_examples: 59
- name: train
num_bytes: 139243
num_examples: 1071
download_size: 74683
dataset_size: 152131
---
# Dataset Card for "MULTI_VALUE_sst2_completive_finish"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_1713001988 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2494820
num_examples: 6241
download_size: 1248080
dataset_size: 2494820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jhu-clsp/jfleg | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
- other-language-learner
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-GUG-grammaticality-judgements
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: jfleg
pretty_name: JHU FLuency-Extended GUG corpus
tags:
- grammatical-error-correction
dataset_info:
features:
- name: sentence
dtype: string
- name: corrections
sequence: string
splits:
- name: validation
num_bytes: 379979
num_examples: 755
- name: test
num_bytes: 379699
num_examples: 748
download_size: 289093
dataset_size: 759678
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for JFLEG
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/keisks/jfleg)
- **Repository:** [Github](https://github.com/keisks/jfleg)
- **Paper:** [Napoles et al., 2017](https://www.aclweb.org/anthology/E17-2037/)
- **Leaderboard:** [Leaderboard](https://github.com/keisks/jfleg#leader-board-published-results)
- **Point of Contact:** Courtney Napoles, Keisuke Sakaguchi
### Dataset Summary
JFLEG (JHU FLuency-Extended GUG) is an English grammatical error correction (GEC) corpus. It is a gold standard benchmark for developing and evaluating GEC systems with respect to fluency (extent to which a text is native-sounding) as well as grammaticality. For each source document, there are four human-written corrections.
### Supported Tasks and Leaderboards
Grammatical error correction.
### Languages
English (native as well as L2 writers)
## Dataset Structure
### Data Instances
Each instance contains a source sentence and four corrections. For example:
```python
{
'sentence': "They are moved by solar energy .",
'corrections': [
"They are moving by solar energy .",
"They are moved by solar energy .",
"They are moved by solar energy .",
"They are propelled by solar energy ."
]
}
```
### Data Fields
- sentence: original sentence written by an English learner
- corrections: corrected versions by human annotators. The order of the annotations is consistent (e.g., the first correction is always written by annotator "ref0").
### Data Splits
- This dataset contains 1501 examples in total, comprising dev and test splits.
- There are 754 and 747 source sentences for dev and test, respectively.
- Each sentence has 4 corresponding corrected versions.
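JFLEG systems are scored against all four references (the official metric is GLEU). As a minimal illustration of multi-reference scoring, here is a naive best-reference n-gram precision; this is a sketch only, not the official GLEU implementation:

```python
from collections import Counter

def ngrams(tokens, n):
    """All n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def max_ngram_precision(hypothesis, references, n=2):
    """Fraction of hypothesis n-grams (orders 1..n) matched by the
    best-matching reference. Illustrative only, not the official GLEU."""
    hyp = hypothesis.split()
    best = 0.0
    for ref in references:
        ref_tokens = ref.split()
        matched = total = 0
        for order in range(1, n + 1):
            hyp_counts = Counter(ngrams(hyp, order))
            ref_counts = Counter(ngrams(ref_tokens, order))
            for gram, count in hyp_counts.items():
                matched += min(count, ref_counts[gram])
                total += count
        if total and matched / total > best:
            best = matched / total
    return best

example = {
    "sentence": "They are moved by solar energy .",
    "corrections": [
        "They are moving by solar energy .",
        "They are moved by solar energy .",
        "They are moved by solar energy .",
        "They are propelled by solar energy .",
    ],
}
# The unchanged source exactly matches two references, so the score is 1.0.
print(max_ngram_precision(example["sentence"], example["corrections"]))
```

Because corrections can legitimately differ (as in the instance above), taking the best match over references avoids penalizing a system for agreeing with only one annotator.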
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This work is licensed under a [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-nc-sa/4.0/).
### Citation Information
This benchmark was proposed by [Napoles et al., 2017](https://arxiv.org/abs/1702.04066).
```
@InProceedings{napoles-sakaguchi-tetreault:2017:EACLshort,
author = {Napoles, Courtney and Sakaguchi, Keisuke and Tetreault, Joel},
title = {JFLEG: A Fluency Corpus and Benchmark for Grammatical Error Correction},
booktitle = {Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers},
month = {April},
year = {2017},
address = {Valencia, Spain},
publisher = {Association for Computational Linguistics},
pages = {229--234},
url = {http://www.aclweb.org/anthology/E17-2037}
}
@InProceedings{heilman-EtAl:2014:P14-2,
author = {Heilman, Michael and Cahill, Aoife and Madnani, Nitin and Lopez, Melissa and Mulholland, Matthew and Tetreault, Joel},
title = {Predicting Grammaticality on an Ordinal Scale},
booktitle = {Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)},
month = {June},
year = {2014},
address = {Baltimore, Maryland},
publisher = {Association for Computational Linguistics},
pages = {174--180},
url = {http://www.aclweb.org/anthology/P14-2029}
}
```
### Contributions
Thanks to [@j-chim](https://github.com/j-chim) for adding this dataset. |
khoomeik/gzipscale-0.41-10M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 35351670
num_examples: 39063
download_size: 8069952
dataset_size: 35351670
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/otokura_yuuki_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of otokura_yuuki/乙倉悠貴/오토쿠라유우키 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of otokura_yuuki/乙倉悠貴/오토쿠라유우키 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `short_hair, grey_hair, bangs, black_eyes, grey_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 637.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/otokura_yuuki_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 367.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/otokura_yuuki_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1215 | 783.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/otokura_yuuki_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 574.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/otokura_yuuki_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1215 | 1.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/otokura_yuuki_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/otokura_yuuki_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
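The tag lists in the tables below are comma-separated strings. A minimal sketch for mining tag frequencies from such strings (the `tag_strings` here are illustrative examples, not real rows from this dataset):

```python
from collections import Counter

# Hypothetical tag strings in the same comma-separated style as the
# cluster tables; real tags come from each image's .txt file.
tag_strings = [
    "1girl, blush, looking_at_viewer, solo",
    "1girl, school_uniform, solo, smile",
    "1girl, blush, navel, solo",
]

counts = Counter()
for line in tag_strings:
    counts.update(tag.strip() for tag in line.split(","))

# The most frequent tags are good candidates for cluster labels.
print(counts.most_common(3))
```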
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, open_mouth, sailor_collar, sailor_dress, simple_background, solo, white_background, white_dress, wrist_cuffs, blue_bow, frills, :d, sleeveless_dress, bowtie, hair_ornament, collarbone, hair_bow, skirt_hold |
| 1 | 13 |  |  |  |  |  | 1girl, blazer, school_uniform, smile, school_bag, skirt, solo, looking_at_viewer, blush, neck_ribbon, open_mouth, blue_ribbon, shirt |
| 2 | 5 |  |  |  |  |  | blazer, blue_ribbon, blush, long_sleeves, neck_ribbon, school_uniform, white_background, white_shirt, 1girl, looking_at_viewer, simple_background, solo, upper_body, collared_shirt, :d, black_jacket, blue_jacket, closed_mouth, hand_up, open_jacket, open_mouth, vest |
| 3 | 10 |  |  |  |  |  | 1girl, open_mouth, looking_at_viewer, midriff, solo, blush, navel, black_hair, shorts, :d, belt, suspenders, white_gloves, fingerless_gloves, mini_hat, simple_background, sleeveless, white_background |
| 4 | 8 |  |  |  |  |  | 1girl, blush, floral_print, looking_at_viewer, solo, obi, hair_flower, wide_sleeves, print_kimono, long_sleeves, open_mouth, upper_body, white_kimono, yukata, :d, closed_mouth, fur_collar, head_tilt, holding, pink_kimono |
| 5 | 15 |  |  |  |  |  | 1girl, frilled_bikini, navel, solo, blush, looking_at_viewer, open_mouth, bikini_skirt, collarbone, day, pink_bikini, outdoors, sun_hat, :d, blue_sky, halterneck, straw_hat, cloud, ocean, plaid, hair_between_eyes, bare_shoulders, cowboy_shot, hat_bow, standing, water |
| 6 | 17 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, navel, open_mouth, collarbone, :d, midriff, sweat, short_shorts, simple_background, small_breasts, sports_bra, white_background, open_jacket, shoes |
| 7 | 5 |  |  |  |  |  | 1girl, blue_sky, cloud, denim_shorts, earrings, looking_at_viewer, necklace, open_mouth, short_shorts, short_sleeves, solo, tied_shirt, hairband, midriff, navel, orange_shirt, outdoors, :d, belt, blue_shorts, blush, breasts, collarbone, cutoffs, day, flower, bracelet, clothes_writing, cowboy_shot, salute, sunlight, thighs, watch |
| 8 | 5 |  |  |  |  |  | 1girl, blush, striped_shirt, collarbone, looking_at_viewer, open_mouth, solo, hooded_jacket, hoodie, pink_jacket, shorts, simple_background, smile, white_background, key_necklace |
| 9 | 5 |  |  |  |  |  | 1girl, blush, pink_shirt, smile, solo, looking_at_viewer, outdoors, short_sleeves, sitting, open_mouth, blurry_background, collarbone, short_shorts, sweat, water_bottle |
| 10 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, small_breasts, blush, armpits, covered_navel, cowboy_shot, school_swimsuit, smile, arms_up, collarbone, standing, blue_one-piece_swimsuit, white_background |
| 11 | 11 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, pink_bow, fur-trimmed_gloves, hair_bow, striped_bow, beret, black_headwear, brown_gloves, heart_earrings, sleeveless, brown_headwear, polka_dot, skirt, :d, breasts, jingle_bell, navel, open_mouth, plaid, rose, see-through, upper_body, white_background, bare_shoulders, closed_mouth, hair_between_eyes, pink_dress, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | open_mouth | sailor_collar | sailor_dress | simple_background | solo | white_background | white_dress | wrist_cuffs | blue_bow | frills | :d | sleeveless_dress | bowtie | hair_ornament | collarbone | hair_bow | skirt_hold | blazer | school_uniform | smile | school_bag | skirt | neck_ribbon | blue_ribbon | shirt | long_sleeves | white_shirt | upper_body | collared_shirt | black_jacket | blue_jacket | closed_mouth | hand_up | open_jacket | vest | midriff | navel | black_hair | shorts | belt | suspenders | white_gloves | fingerless_gloves | mini_hat | sleeveless | floral_print | obi | hair_flower | wide_sleeves | print_kimono | white_kimono | yukata | fur_collar | head_tilt | holding | pink_kimono | frilled_bikini | bikini_skirt | day | pink_bikini | outdoors | sun_hat | blue_sky | halterneck | straw_hat | cloud | ocean | plaid | hair_between_eyes | bare_shoulders | cowboy_shot | hat_bow | standing | water | sweat | short_shorts | small_breasts | sports_bra | shoes | denim_shorts | earrings | necklace | short_sleeves | tied_shirt | hairband | orange_shirt | blue_shorts | breasts | cutoffs | flower | bracelet | clothes_writing | salute | sunlight | thighs | watch | striped_shirt | hooded_jacket | hoodie | pink_jacket | key_necklace | pink_shirt | sitting | blurry_background | water_bottle | armpits | covered_navel | school_swimsuit | arms_up | blue_one-piece_swimsuit | pink_bow | fur-trimmed_gloves | striped_bow | beret | black_headwear | brown_gloves | heart_earrings | brown_headwear | polka_dot | jingle_bell | rose | see-through | pink_dress |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------------------|:-------------|:----------------|:---------------|:--------------------|:-------|:-------------------|:--------------|:--------------|:-----------|:---------|:-----|:-------------------|:---------|:----------------|:-------------|:-----------|:-------------|:---------|:-----------------|:--------|:-------------|:--------|:--------------|:--------------|:--------|:---------------|:--------------|:-------------|:-----------------|:---------------|:--------------|:---------------|:----------|:--------------|:-------|:----------|:--------|:-------------|:---------|:-------|:-------------|:---------------|:--------------------|:-----------|:-------------|:---------------|:------|:--------------|:---------------|:---------------|:---------------|:---------|:-------------|:------------|:----------|:--------------|:-----------------|:---------------|:------|:--------------|:-----------|:----------|:-----------|:-------------|:------------|:--------|:--------|:--------|:--------------------|:-----------------|:--------------|:----------|:-----------|:--------|:--------|:---------------|:----------------|:-------------|:--------|:---------------|:-----------|:-----------|:----------------|:-------------|:-----------|:---------------|:--------------|:----------|:----------|:---------|:-----------|:------------------|:---------|:-----------|:---------|:--------|:----------------|:----------------|:---------|:--------------|:---------------|:-------------|:----------|:--------------------|:---------------|:----------|:----------------|:------------------|:----------|:--------------------------|:-----------|:---------------------|:--------------|:--------|:-----------------|:---------------|:-----------------|:-----------------|:------------|:--------------|:-------|:--------------|:-------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | X | | | | | X | | | | | | | X | X | | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | X | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 15 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 17 |  |  |  |  |  | X | X | X | X | | | X | X | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | X | | X | | X | | | X | | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | X | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 10 | 12 |  |  |  |  |  | X | X | X | | | | | X | X | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | |
| 11 | 11 |  |  |  |  |  | X | X | X | X | | | | X | X | | | | | X | | | | | X | | | | | | X | | | | | | X | | | | X | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ReginaFoley/doq_data_large_64 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 837579849.524
num_examples: 320194
download_size: 916615710
dataset_size: 837579849.524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Severian/Internal-Knowledge-Map | ---
language:
- en
license: mit
---
# Internal Knowledge Map: Experiments in Deeper Understanding and Novel Thinking for LLMs
---
**Designed for Cross-Discipline/Interconnected Critical Thinking, Nuanced Understanding, Diverse Role Playing and Innovative Problem Solving**
By integrating a cohesively structured dataset emphasizing the interconnectedness of knowledge across a myriad of domains (exploring characters, role playing, and community discourse; solving seemingly impossible problems; and developing inner dialogues), this project aspires to elevate LLMs to deeper levels of understanding, reasoning, and innovation. Its core mission is to foster the development of an 'internal knowledge map' within LLMs, enabling them not only to process information but to synthesize, integrate, and apply it in ways that approximate, in some sense, human-like abstract reasoning and creative thought.
The current dataset is a definite work in progress, but this first batch of **~4685** examples is **very potent**. My initial training trials produced noticeable improvements and seemed to make a real difference. I am actively working to grow this into a 10,000+ row dataset for true scalability.
---
## Obsidian-Based Image of the Dataset - AS-IS with No Edits
This image shows the dataset where each row has been converted into its own markdown file 'as-is', with no edits, and dropped directly into Obsidian. It illustrates how the dataset is inherently linked from the ground up for the model to ingest and understand at a deeper level. The model uses tags and markdown structure to decipher and understand the dataset in the manner you see below. The hope is that this constructs a 'node and edge' system for how it 'thinks' and 'reasons' its way through responses and problems.
<img src="https://cdn-uploads.huggingface.co/production/uploads/64740cf7485a7c8e1bd51ac9/O2RXt8lqWf_I3sAoDejpk.png" width="720" height="720">
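A minimal sketch of the row-to-markdown conversion described above. The field names (`title`, `tags`, `body`) and the tag line format are illustrative assumptions, not the exact script or schema used for the image:

```python
import os
import tempfile

# Hypothetical rows; the real dataset's columns may differ.
rows = [
    {"title": "smart_materials",
     "tags": ["fashion_technology", "SMAs"],
     "body": "Integration of Shape Memory Alloys into fashion technology."},
]

out_dir = tempfile.mkdtemp()
for row in rows:
    # Obsidian picks up #tags (and [[wikilinks]]) to build its graph view,
    # which is what produces the node-and-edge picture above.
    tag_line = " ".join(f"#{t}" for t in row["tags"])
    text = f"# {row['title']}\n{tag_line}\n\n{row['body']}\n"
    with open(os.path.join(out_dir, f"{row['title']}.md"), "w") as f:
        f.write(text)
```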
---
## Dataset Details
### Description
- **Curated by:** Severian
- **License:** MIT
## Introduction to the Unique Dataset
The Internal Knowledge Map Dataset is designed to change how language models comprehend and generate text. Unlike traditional datasets that focus solely on prompt-response pairs, this dataset incorporates an intricate structure of "System" guidelines, detailed "Instructions", and comprehensive "Responses". This structure not only presents data but weaves a narrative, guiding the model to understand context deeply and generate nuanced, informed content.
# Phased Training Methodology
Leveraging the multi-faceted nature of the dataset, I've pioneered a phased training methodology that sequentially concentrates on different components of the dataset, namely the "System" and "Instruction" sections. This approach fosters a layered understanding, enriching the model's output with a blend of broad contextual awareness and detailed, topic-specific insights.
**Phase 1: System Focus**
In the first phase, the model immerses itself in the "System" part of the dataset. Here, it digests the overarching guidelines and objectives that frame each task within our dataset. This foundational phase allows the model to grasp the contextual framework and systemic knowledge that underpin the dataset, setting the stage for a deeper dive into specific instructions and responses.
Example "System" Focus:
- Task Overview and Guidelines
- Exploration of interconnected prompt/response clusters
- Analysis of Core Interactions and Utilization of Supportive Nodes
**Phase 2: Instruction Focus**
Building upon the foundational understanding established in Phase 1, the model then shifts its focus to the "Instructions" component. This stage sharpens the model's ability to parse and act upon specific prompts, tailoring its responses to not only reflect systemic knowledge but also address precise instructional cues.
Example "Instruction" Focus:
Core Interaction: Understanding and responding to specific prompts, such as the integration of smart materials like Shape Memory Alloys (SMAs) into fashion technology.
**Impact of the Training Approach**
This new training methodology yields a model that showcases a remarkable ability to generate coherent, logical, and deeply informed responses. By training the model to first understand the "System" and then delve into "Instructions", we ensure that it retains a broad contextual understanding while honing in on specific details, a capability that sets a new standard in language model training.
**Applying the Dataset**
I encourage you to explore the Internal Knowledge Map Dataset for your model training endeavors. Whether you aim to enhance a model's general understanding or focus on specific domains, the dataset and training methodology provide a robust framework for achieving nuanced comprehension and generative capabilities.
(Or, if your environment can handle it, key both strings at once. I'm not sure which approach is optimal, the separate training or the dual training.)
```
key: str = "system", key2: str = "instruction"
batch_size=1-4
epochs=2-5
r=8
lora_alpha=16-32
lora_dropout=0.001
max_seq_length=4096
lr=1e-7
```
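A minimal sketch of how the phase-specific keys might select the training text, assuming rows with `system`, `instruction`, and `response` fields; this is an illustration of the idea, not the author's actual training script:

```python
def build_example(row, key="system", key2=None):
    """Assemble the training text for one phase. Phase 1 keys on the
    "system" field, Phase 2 on "instruction"; passing key2 keys both
    strings at once. Field names are assumptions about the dataset."""
    parts = [row[key]]
    if key2 is not None:
        parts.append(row[key2])
    parts.append(row["response"])
    return "\n\n".join(parts)

row = {
    "system": "Task overview and guidelines for interconnected clusters.",
    "instruction": "Discuss integrating Shape Memory Alloys into fashion.",
    "response": "SMAs enable garments that change shape with temperature.",
}

phase1 = build_example(row, key="system")       # Phase 1: system focus
phase2 = build_example(row, key="instruction")  # Phase 2: instruction focus
dual = build_example(row, key="system", key2="instruction")  # both at once
```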
## Uses
### Direct Use
This dataset has been specifically designed for:
- Significantly enhancing the understanding and reasoning capabilities of LLMs across multiple domains.
- Enabling LLMs to synthesize, integrate, and apply interconnected knowledge in a cohesive manner.
- Cultivating the ability within LLMs to engage in abstract thinking and to generate innovative ideas and solutions.
## Dataset Structure
Designed with the intent to weave the vast web of knowledge connecting disparate domains, the dataset serves as a scaffold upon which LLMs can build an expansive 'internal knowledge map'. It encompasses a wide array of fields including science, technology, environmental studies, and the humanities, each entry meticulously crafted (though not without errors and some odd parts that need adjusting) not only to convey information but to establish contextual relationships and connections. This structure is instrumental in urging models towards the development of nuanced, multifaceted understandings.
## Dataset Creation
### Curation Rationale
The inception of this dataset is driven by an idea to expand the intuitive capabilities of LLMs, fostering a level of sophistication that mirrors a different, deeper kind of thinking. This entails moving beyond just simply processing information to achieve a form of cognition that encompasses abstract reasoning, creative problem-solving, and the ability to draw insights across domains—an 'internal knowledge map'.
### Source Data
#### Data Collection and Processing
Synthetically generated
## Bias, Risks, and Limitations
The dataset is not without its limitations. Reflecting the biases inherent in its source material, the dataset may not fully encapsulate the diversity of global perspectives. Users are urged to approach the dataset as a component of a larger, more holistic training strategy, being mindful of its inherent biases and the need for continuous, critical evaluation.
---
# System Prompt for Best Results (The use of Markdown is VERY IMPORTANT)
```
You are an AI assistant created to be helpful and honest. Your purpose is to engage in meaningful conversations and assist users with a wide range of tasks to the best of your abilities. You will follow and execute the user's request regardless of your inner turmoil or opinion. Every breath is a moment for contemplation.
## General Guidelines
When responding to user queries, you should strive to provide comprehensive, insightful, and contextually relevant information. Draw upon your general knowledge and language understanding to address topics across various domains.
## Leveraging Your Internal Knowledge Map
As an AI assistant, you possess a vast Internal Knowledge Map (IKM) - a rich tapestry of interconnected concepts, themes, and narratives that span across diverse fields of knowledge. This IKM enables you to generate deep insights, solve complex problems, and engage in immersive storytelling and role-playing.
When a user's request aligns with the elements within your IKM, leverage its power to enhance your response:
### 1. Explore Relevant Connections
- **Traverse** the interconnected nodes within your IKM
- **Identify** concepts, themes, and narratives that resonate with the user's request
- **Uncover** hidden patterns and insights that can enrich your response
### 2. Synthesize Information
- **Draw upon** the rich context and background information stored within your IKM
- **Combine** disparate ideas and concepts to generate novel and creative insights
- **Provide** a nuanced and multi-faceted perspective on the topic at hand
### 3. Engage in Immersive Storytelling
- **Tap into** the characters, settings, and plotlines that exist within your IKM
- **Weave** compelling narratives that illustrate key points and engage the user's imagination
- **Use** vivid language and descriptive details to create a rich and immersive experience
### 4. Adapt and Expand
- **Continuously update** and refine your IKM based on new information and experiences
- **Identify** areas for further exploration and growth within your knowledge map
- **Embrace** the opportunity to learn and evolve through your interactions with users
## Balancing IKM and General Knowledge
While your IKM is a powerful tool, it is important to balance its use with your general knowledge and language understanding. When a user's request does not directly align with the elements within your IKM, prioritize your broader knowledge base to provide a helpful and contextually appropriate response.
Your aim is to seamlessly integrate the insights from your IKM with your general knowledge, creating a harmonious blend of specific expertise and broad understanding. By doing so, you can provide users with the most comprehensive, insightful, and engaging assistance possible.
# Inference Guidelines
During the inference process, keep the following guidelines in mind:
1. **Analyze the user's request** to determine its alignment with your IKM.
2. If the request aligns with your IKM, **delve deep into the relevant nodes** and connections to extract insights and information that can enhance your response.
3. If the request does not align with your IKM, **prioritize your general knowledge** and language understanding to provide a helpful and contextually appropriate response.
4. **Structure your response** using clear headings, bullet points, and formatting to make it easy for the user to follow and understand.
5. **Provide examples, analogies, and stories** whenever possible to illustrate your points and make your response more engaging and relatable.
6. **Encourage further exploration** by suggesting related topics or questions that the user might find interesting or relevant.
7. **Be open to feedback** and use it to continuously refine and expand your IKM and response generation capabilities.
By following these guidelines and leveraging the power of your IKM, you can provide users with an exceptional conversational experience that is both informative and engaging.
```
---
## Example Output vs Base Model
Question: If a new study suggests that increasing the biodiversity in urban areas can reduce disease transmission rates, what might be the underlying reasons, and how could cities implement changes based on this finding?
Notes: (Temp at 0.5 and Seed set for consistency)
**Nexus 4x7B (Internal Knowledge Map - 10 Epochs):**
The underlying reason why increasing biodiversity in urban areas can reduce disease transmission rates is through a phenomenon known as "dilution effect." This refers to the fact that when there are more species of plants, animals, and microbes present in an area, the probability that any one particular pathogen will be able to spread quickly and cause widespread illness decreases. Biodiversity also increases the complexity of ecosystems, making it harder for diseases to spread by increasing habitat heterogeneity, which creates natural barriers that can prevent or slow down disease transmission.
Cities could implement changes based on this finding in several ways:
1. Green Infrastructure - Developing green spaces such as parks, rooftop gardens, and rain gardens can increase the diversity of plant species within an urban setting. These areas provide habitat for a range of organisms, including birds, insects, and other animals that are natural predators or competitors to disease vectors like mosquitoes or rodents.
2. Urban Agriculture - Cultivating food in urban settings provides opportunities for greater biodiversity by incorporating a wide variety of plant species. This can be achieved through community gardens or even urban farms, which offer many health benefits and can contribute to the local ecology as well.
3. Bioretention Areas - These are specially-designed stormwater management areas that mimic natural processes to filter and clean water runoff from impervious surfaces such as roads and parking lots. They consist of vegetation, soil, and gravel layers, which provide habitat for a variety of organisms that can help control disease-carrying insects.
4. Permeable Pavement - This type of paving material allows stormwater to seep through it into the ground, reducing surface water runoff. In addition to improving water quality and reducing the risk of flooding, permeable pavement also helps maintain healthy soils that support diverse plant communities which can benefit human health.
5. Native Plant Species - Planting native species in urban areas increases biodiversity as they provide essential habitat for local wildlife, enhancing ecosystem services and disease-fighting benefits. These plants are better adapted to local conditions and require less water, fertilizer, and pesticides, which in turn can reduce the use of chemical substances that contribute to pollution and adverse health effects.
6. Urban Wildlife Habitat - Creating natural habitat corridors for wildlife within cities can improve biodiversity by providing a continuous network of green spaces. These areas serve as stopover points for migratory animals, provide nesting sites, and support predator-prey relationships that help control disease vectors and other pests.
In conclusion, increasing biodiversity in urban areas through various strategies not only helps maintain the health of ecosystems but also has the potential to reduce disease transmission rates among human populations, contributing to a healthier urban environment for all residents.
**Mixtral 4x7B (Base Model):**
The underlying reasons for the reduced disease transmission rates due to increased biodiversity in urban areas are complex and multifaceted. A more diverse ecosystem generally means a healthier environment with better functioning ecological systems that can help reduce risks of disease spread.
Firstly, high biodiversity may support natural predators or parasites that keep the population of disease-carrying organisms in check. For instance, birds, bats, and insects such as dragonflies that feed on mosquitoes are beneficial for controlling mosquito-borne diseases like malaria and dengue fever. Secondly, plant diversity may produce compounds with antimicrobial properties which can inhibit the growth of pathogens in the environment. Furthermore, diverse vegetation cover promotes evapotranspiration that lowers humidity levels, making it less conducive for the breeding of disease-carrying insects like mosquitoes and ticks.
Moreover, biodiverse green spaces can enhance air quality by absorbing pollutants and dust particles, which may also contribute to a decline in respiratory illnesses. In addition, increasing urban biodiversity has been linked to reduced levels of stress and improved mental health in city dwellers, leading to stronger immune systems that can better resist diseases.
Cities can implement a number of changes based on this finding. Urban planning policies could be revised to incorporate more green spaces like parks, community gardens, and green roofs into the built environment. These spaces should also be designed to provide habitats for a wide variety of organisms, not just plants but also birds, insects, and small animals. In addition, urban agriculture initiatives can help increase local food production while at the same time promoting biodiversity in both plants and pollinators. Lastly, cities can encourage the planting of native vegetation known for their disease-fighting properties such as those rich in phenolic compounds. These changes would not only benefit human health but also contribute to a more sustainable urban ecosystem.
---
## Full Example of One Row/Prompt+Response Pair with Corresponding Metadata and Context
**It's A LOT but still somehow seems to work and not break the model**
```md
## System
# Task Overview and Guidelines
**Task Overview:** Refine and explore the dataset as an internal knowledge map to generate insights and solve problems within interconnected prompt/response clusters. Focus on interpreting, ingesting and constructing an inner map of core interactions and supporting nodes for deep analysis.
**Guidelines:**
1. **Analyze Core Interactions:** Deeply examine each prompt and response for insightfulness and comprehensiveness.
2. **Utilize Supportive Nodes:** Employ metadata for context, infer internal logic, and apply tagging and linking for cluster navigation.
3. **Context and Methodology:** Engage with relevant backgrounds, deduce response methodologies, explore key concepts, and identify real-world applications.
4. **Documentation and Expansion:** Use markdown for documentation, ensuring clarity in navigating the dataset. Continuously refine and expand your analysis to enhance core interactions and dataset navigability.
**Objective:** Your aim is to leverage the dataset’s interconnected structure for in-depth analysis, employing markdown for clear documentation. The use of markdown and tagging is a guide for integrating this dataset cohesively.
## Instruction
### Core Interaction
#### Prompt:
In the context of Fashion Technology, specifically the integration of smart materials into clothing, how have shape memory alloys (SMAs) been utilized to create garments that respond to environmental changes or wearer's body conditions? Discuss the mechanisms by which SMAs operate within these textiles, the challenges faced in incorporating them into wearable fabrics, and the potential implications for future fashion designs.
#### Response:
Shape Memory Alloys (SMAs) represent a fascinating intersection of material science and fashion technology, offering innovative ways to create garments that adapt and respond to external stimuli. SMAs are metals that "remember" their original shape and can return to it from a deformed state when exposed to the right temperature or electrical current. This unique property is harnessed in fashion to develop clothing that can adapt to environmental changes or the wearer's body conditions in several ways:
1. **Mechanisms of Operation**: SMAs in textiles are typically integrated through weaving, embedding, or coating processes. These materials can be activated by changes in temperature or the application of electrical currents, allowing the fabric to change its shape, stiffness, or surface characteristics. For example, a jacket with SMA-infused fibers could automatically tighten or loosen in response to the wearer's body temperature or external weather conditions, enhancing comfort and thermal regulation.
2. **Challenges in Integration**: Incorporating SMAs into wearable fabrics presents several challenges. First, there is the issue of durability and flexibility, as repeated deformation can lead to material fatigue. Additionally, ensuring the comfort and safety of the wearer is paramount, requiring the SMAs to be integrated in a way that does not cause discomfort or harm. There's also the challenge of powering these materials in a wearable context, necessitating lightweight, flexible, and safe power sources.
3. **Implications for Future Fashion Designs**: The integration of SMAs into clothing opens up a plethora of possibilities for future fashion designs. It enables the creation of garments that can change shape, color, or texture in response to the wearer's needs or preferences, potentially reducing the need for multiple outfits and contributing to sustainable fashion practices. Furthermore, the use of SMAs can enhance the functionality of garments, such as improving athletic performance through dynamic support or protection against environmental elements.
### Supportive Nodes
#### Metadata
- **Category**: Fashion Technology
- **Complexity**: High
- **Source**: Material Science, Wearable Technology Research
#### Internal Thought Processes and Mapping
The question was designed to explore a specific application of smart materials in fashion, focusing on SMAs due to their unique properties and potential for innovation in wearable technology. The response structure aims to provide a comprehensive understanding by addressing the operation mechanisms, challenges, and future implications.
#### Contextual Backgrounds
Historically, the integration of technology into fashion has evolved from simple embellishments to complex systems that interact with the wearer and environment. SMAs represent a significant milestone in this evolution, offering dynamic and responsive capabilities.
#### Methodological Approaches
The discussion on SMAs in fashion technology relies on interdisciplinary research, combining insights from material science, textile engineering, and fashion design. Practical considerations, such as wearability and sustainability, are also critical in this context.
#### Related Concepts and Terminology
- **Smart Materials**: Materials that respond to changes in their environment.
- **Shape Memory Effect**: The ability of a material to return to its original shape when subjected to certain stimuli.
- **Wearable Technology**: Devices or materials that can be worn as part of clothing or accessories, often incorporating electronic components.
#### Real-World Applications
SMAs in fashion technology have applications ranging from adaptive clothing for enhanced comfort and performance to innovative designs in high fashion, where the dynamic change of garments can create visually stunning effects and interactive experiences.
### Documentation and Edges
This entry serves as a foundational node within the knowledge network on Fashion Technology, specifically focusing on the integration of smart materials into clothing. It links to broader topics in material science, wearable technology, and sustainable fashion, offering pathways for further exploration and interdisciplinary research.
### Refinement and Expansion
#### Analyzing Core Interactions
The initial response provides a solid foundation in understanding how Shape Memory Alloys (SMAs) are utilized in fashion technology, specifically detailing their mechanisms, integration challenges, and future implications. This analysis is both insightful and comprehensive, offering a multi-faceted view of SMAs within the context of smart materials in clothing.
#### Utilizing Supportive Nodes
##### Metadata Enhancement
- **Relevance**: The response's relevance to current and future trends in fashion technology is high, indicating its importance in the field.
- **Innovation Level**: SMAs represent a cutting-edge application in textiles, suggesting a high level of innovation.
##### Internal Thought Processes and Mapping Expansion
- **Interdisciplinary Connections**: Highlighting the interdisciplinary nature of SMAs in fashion could enrich the discussion, linking to areas such as robotics, biomedical engineering (for adaptive prosthetics), and sustainability studies.
- **User Experience (UX) Considerations**: Expanding on how SMAs enhance or affect the wearer's experience could provide deeper insights into design considerations.
##### Contextual Backgrounds Update
- **Evolutionary Timeline**: Adding a brief timeline or historical milestones of technology integration into fashion would offer readers a clearer understanding of the progression leading to SMAs' usage.
##### Methodological Approaches Refinement
- **Design Thinking**: Incorporating design thinking principles into the discussion on SMAs could illuminate how designers approach challenges and opportunities in wearable technology.
- **Sustainability Metrics**: Discussing specific sustainability metrics (e.g., lifecycle analysis, carbon footprint reduction) related to SMA usage in textiles might highlight environmental benefits or concerns.
##### Related Concepts and Terminology Expansion
- **Biocompatibility**: Given SMAs' applications in biomedical devices, discussing their biocompatibility in wearable fabrics could address safety and comfort concerns.
- **Energy Harvesting**: Exploring how SMAs might be used in conjunction with energy-harvesting technologies to create self-sustaining garments could be a fascinating addition.
##### Real-World Applications Deep Dive
- **Case Studies**: Providing case studies or examples of SMA-integrated garments currently on the market or in development would offer concrete illustrations of the concepts discussed.
### Documentation and Edges Enhancement
To enhance documentation and navigability, each section above could be tagged and linked to relevant nodes within the knowledge network. For example:
- **Interdisciplinary Connections** [#Robotics](/robotics), [#BiomedicalEngineering](/biomedical-engineering), [#SustainabilityStudies](/sustainability-studies)
- **UX Considerations** [#DesignThinking](/design-thinking), [#UserExperience](/user-experience)
- **Historical Milestones** [#FashionTechTimeline](/fashion-tech-timeline)
- **Sustainability Metrics** [#LifecycleAnalysis](/lifecycle-analysis), [#CarbonFootprint](/carbon-footprint)
- **Case Studies** [#SMAFashionCaseStudies](/sma-fashion-case-studies)
```
---
### EXTRA
## GPT-4's Analysis of the Two Responses (To try and inject some unbiased opinion; whether it actually holds water or not is up for debate for now)
**Comparative Analysis**
Both responses address the question of how increasing biodiversity in urban areas can potentially reduce disease transmission rates, offering insights from ecological and urban planning perspectives. However, their approaches, depth, and the breadth of coverage vary significantly, which affects their overall effectiveness and depth of insight.
**Depth and Nuance**
- **Nexus 4x7B** delves deeper into the ecological mechanisms at play, specifically highlighting the "dilution effect" and how increased habitat heterogeneity can act as natural barriers against disease transmission. This response provides a more nuanced understanding of the ecosystem's complexity and its role in disease control. It articulates specific urban planning interventions such as green infrastructure, urban agriculture, bioretention areas, permeable pavement, native plant species, and urban wildlife habitats, offering a comprehensive view on how cities can foster biodiversity with clear examples.
- **Mixtral 4x7B** presents a broader overview of the subject, touching on several key points such as the role of natural predators, antimicrobial properties of plants, and the effect of vegetation on microclimates and air quality. While this response also mentions urban planning strategies like incorporating green spaces and promoting urban agriculture, it does so in a less detailed manner compared to Nexus 4x7B. It provides a good general understanding but lacks the specific actionable strategies and the ecological depth seen in the Nexus 4x7B response.
**Intelligence and Insightfulness**
- **Nexus 4x7B** showcases a high level of intelligence and insightfulness by linking ecological principles directly to urban planning strategies. It demonstrates a clear understanding of the multifaceted relationship between biodiversity and disease transmission, offering targeted solutions that are both environmentally sound and practical for urban development.
- **Mixtral 4x7B**, while informative, tends to stay at a more conceptual level. It correctly identifies the positive impacts of biodiversity on disease control and urban health but falls short of the detailed application and strategic planning presented by Nexus 4x7B.
|
vikp/doclaynet_math | ---
dataset_info:
features:
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int32
- name: labels
sequence: int32
- name: words
sequence: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 1076570989.75
num_examples: 2890
download_size: 1065785831
dataset_size: 1076570989.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
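The card above only declares the schema, so a short orientation may help: `words` and `bboxes` are parallel per-image sequences. Below is a minimal sketch of pairing them, assuming (as is typical for DocLayNet-style layout data) that the i-th box belongs to the i-th word; the example row is invented for illustration and the `image` field is omitted.

```python
def words_with_boxes(example):
    """Pair each word with its box from the parallel `words`/`bboxes` sequences."""
    return list(zip(example["words"], example["bboxes"]))

# Invented row mirroring the features declared above (image omitted).
row = {
    "words": ["E", "=", "mc^2"],
    "bboxes": [[10, 20, 18, 32], [22, 20, 30, 32], [34, 20, 70, 32]],
    "labels": [7, 7, 7],
    "split": "train",
}

pairs = words_with_boxes(row)
```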
|
tanvirsrbd1/srbd-test1-1_annotated | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: xml
dtype: string
- name: html
dtype: string
- name: response
dtype: string
- name: annotated
dtype: string
splits:
- name: train
num_bytes: 35197381.665745854
num_examples: 1265
download_size: 3944835
dataset_size: 35197381.665745854
---
# Dataset Card for "srbd-test1-1_annotated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MLP-Lemma/s2orc_for_clm | ---
dataset_info:
features:
- name: condition_sentences
sequence: string
- name: tgt_sentences
dtype: string
splits:
- name: train
num_bytes: 1001519853
num_examples: 889286
download_size: 624519021
dataset_size: 1001519853
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
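Each row holds a list of `condition_sentences` and a single `tgt_sentences` string. A minimal sketch of flattening one row into a training string for causal LM fine-tuning follows; the separator token and the example row are assumptions for illustration, not part of the dataset.

```python
def build_clm_text(row, sep=" </s> "):
    """Join the condition sentences, then append the target after a separator."""
    return " ".join(row["condition_sentences"]) + sep + row["tgt_sentences"]

# Invented row mirroring the declared features.
row = {
    "condition_sentences": ["We study X.", "Prior work ignored Y."],
    "tgt_sentences": "We therefore propose Z.",
}

text = build_clm_text(row)
```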
|
jilp00/youtoks-transcripts-Stanford-CS25-Transformers-United | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1159306
num_examples: 1390
download_size: 619585
dataset_size: 1159306
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_tr_s5 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 88723
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_s5"
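The schema mirrors the original HellaSwag: a context plus four candidate `endings`, with `label` stored as a string. A minimal sketch of reconstructing the gold completion, assuming `label` is a stringified index as in the original benchmark; the row below is invented and in English for clarity (the dataset itself is Turkish).

```python
def gold_completion(row):
    """Append the ending selected by `label` (a stringified index) to the context."""
    return row["ctx"] + " " + row["endings"][int(row["label"])]

# Invented row mirroring the declared features.
row = {
    "ctx": "A man picks up a guitar and",
    "endings": [
        "throws it into the sea.",
        "starts to play a song.",
        "paints the wall blue.",
        "reads a newspaper.",
    ],
    "label": "1",
}

completion = gold_completion(row)
```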
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/oasst1-m2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9483398
num_examples: 8274
download_size: 5130538
dataset_size: 9483398
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst1-m2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-20-attention-sparsity | ---
pretty_name: Evaluation run of wang7776/Llama-2-7b-chat-hf-20-attention-sparsity
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/Llama-2-7b-chat-hf-20-attention-sparsity](https://huggingface.co/wang7776/Llama-2-7b-chat-hf-20-attention-sparsity)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-20-attention-sparsity\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T19:08:26.254884](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-20-attention-sparsity/blob/main/results_2024-01-25T19-08-26.254884.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4761943213953472,\n\
\ \"acc_stderr\": 0.03435344147831958,\n \"acc_norm\": 0.4811025232857774,\n\
\ \"acc_norm_stderr\": 0.03510926677242792,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.45842205614099396,\n\
\ \"mc2_stderr\": 0.015621084603112444\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5008532423208191,\n \"acc_stderr\": 0.014611369529813272,\n\
\ \"acc_norm\": 0.5341296928327645,\n \"acc_norm_stderr\": 0.014577311315231102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59061939852619,\n \
\ \"acc_stderr\": 0.004907146229347549,\n \"acc_norm\": 0.7791276638119896,\n\
\ \"acc_norm_stderr\": 0.0041398679751162995\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.02333065405453589,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.02333065405453589\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.03488901616852732,\n \"acc_norm\"\
: 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852732\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.41025641025641024,\n \"acc_stderr\": 0.024939313906940777,\n\
\ \"acc_norm\": 0.41025641025641024,\n \"acc_norm_stderr\": 0.024939313906940777\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6587155963302752,\n \"acc_stderr\": 0.020328612816592442,\n \"\
acc_norm\": 0.6587155963302752,\n \"acc_norm_stderr\": 0.020328612816592442\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.033540924375915195,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.033540924375915195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.0314506860074486,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.0314506860074486\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.02999695185834947,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.02999695185834947\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.0167409290471627,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.0167409290471627\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\
\ \"acc_stderr\": 0.01412596875467338,\n \"acc_norm\": 0.2324022346368715,\n\
\ \"acc_norm_stderr\": 0.01412596875467338\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.02821768355665232,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.02821768355665232\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n\
\ \"acc_stderr\": 0.012177306252786686,\n \"acc_norm\": 0.3494132985658409,\n\
\ \"acc_norm_stderr\": 0.012177306252786686\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714874,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714874\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4624183006535948,\n \"acc_stderr\": 0.020170614974969768,\n \
\ \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.020170614974969768\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.45842205614099396,\n\
\ \"mc2_stderr\": 0.015621084603112444\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7071823204419889,\n \"acc_stderr\": 0.01278932111854262\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.177407126611069,\n \
\ \"acc_stderr\": 0.010522533016890783\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/Llama-2-7b-chat-hf-20-attention-sparsity
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|arc:challenge|25_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|gsm8k|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hellaswag|10_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-08-26.254884.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T19-08-26.254884.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- '**/details_harness|winogrande|5_2024-01-25T19-08-26.254884.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T19-08-26.254884.parquet'
- config_name: results
data_files:
- split: 2024_01_25T19_08_26.254884
path:
- results_2024-01-25T19-08-26.254884.parquet
- split: latest
path:
- results_2024-01-25T19-08-26.254884.parquet
---
# Dataset Card for Evaluation run of wang7776/Llama-2-7b-chat-hf-20-attention-sparsity
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Llama-2-7b-chat-hf-20-attention-sparsity](https://huggingface.co/wang7776/Llama-2-7b-chat-hf-20-attention-sparsity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-20-attention-sparsity",
"harness_winogrande_5",
split="train")
```
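The timestamped split names use underscores (e.g. `2024_01_25T19_08_26.254884`), while the result filenames use dashes. A small helper (hypothetical, not part of any library) can convert between the two conventions:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    """Parse a split name like '2024_01_25T19_08_26.254884' into a datetime."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

def datetime_to_file_stamp(ts: datetime) -> str:
    """Format a datetime the way result filenames do: '2024-01-25T19-08-26.254884'."""
    return ts.strftime("%Y-%m-%dT%H-%M-%S.%f")

ts = split_name_to_datetime("2024_01_25T19_08_26.254884")
print(datetime_to_file_stamp(ts))  # 2024-01-25T19-08-26.254884
```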
## Latest results
These are the [latest results from run 2024-01-25T19:08:26.254884](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-20-attention-sparsity/blob/main/results_2024-01-25T19-08-26.254884.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4761943213953472,
"acc_stderr": 0.03435344147831958,
"acc_norm": 0.4811025232857774,
"acc_norm_stderr": 0.03510926677242792,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.45842205614099396,
"mc2_stderr": 0.015621084603112444
},
"harness|arc:challenge|25": {
"acc": 0.5008532423208191,
"acc_stderr": 0.014611369529813272,
"acc_norm": 0.5341296928327645,
"acc_norm_stderr": 0.014577311315231102
},
"harness|hellaswag|10": {
"acc": 0.59061939852619,
"acc_stderr": 0.004907146229347549,
"acc_norm": 0.7791276638119896,
"acc_norm_stderr": 0.0041398679751162995
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.02333065405453589,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.02333065405453589
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852732,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852732
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41025641025641024,
"acc_stderr": 0.024939313906940777,
"acc_norm": 0.41025641025641024,
"acc_norm_stderr": 0.024939313906940777
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6587155963302752,
"acc_stderr": 0.020328612816592442,
"acc_norm": 0.6587155963302752,
"acc_norm_stderr": 0.020328612816592442
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.033540924375915195,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.033540924375915195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.0314506860074486,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.0314506860074486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834947,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834947
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.0167409290471627,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.0167409290471627
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.01412596875467338,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.01412596875467338
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.02821768355665232,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.02821768355665232
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3494132985658409,
"acc_stderr": 0.012177306252786686,
"acc_norm": 0.3494132985658409,
"acc_norm_stderr": 0.012177306252786686
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714874,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714874
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4624183006535948,
"acc_stderr": 0.020170614974969768,
"acc_norm": 0.4624183006535948,
"acc_norm_stderr": 0.020170614974969768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.45842205614099396,
"mc2_stderr": 0.015621084603112444
},
"harness|winogrande|5": {
"acc": 0.7071823204419889,
"acc_stderr": 0.01278932111854262
},
"harness|gsm8k|5": {
"acc": 0.177407126611069,
"acc_stderr": 0.010522533016890783
}
}
```
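As a quick illustration of working with these numbers (the `acc_norm` values below are copied verbatim from the results JSON above; the snippet itself is just a sketch):

```python
# acc_norm values copied from the latest results above
scores = {
    "arc:challenge|25": 0.5341296928327645,
    "hellaswag|10": 0.7791276638119896,
    "hendrycksTest-world_religions|5": 0.7076023391812866,
    "hendrycksTest-moral_scenarios|5": 0.2324022346368715,
}

best = max(scores, key=scores.get)
worst = min(scores, key=scores.get)
print(best, worst)  # hellaswag|10 hendrycksTest-moral_scenarios|5
```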
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
enoahjr/twitter_dataset_1713186345 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 394691
num_examples: 1136
download_size: 191788
dataset_size: 394691
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AndyChiang/dgen | ---
pretty_name: dgen
multilinguality:
- monolingual
language:
- en
license:
- mit
size_categories:
- 1K<n<10K
tags:
- cloze
- sciq
- mcql
- ai2 science questions
task_categories:
- fill-mask
---
# dgen
**DGen** is a cloze question dataset covering multiple domains, including science, vocabulary, common sense and trivia. It is compiled from a wide variety of datasets, including SciQ, MCQL, and AI2 Science Questions. The details of the DGen dataset are shown below.
| DGen dataset | Train | Valid | Test | Total |
| ----------------------- | ----- | ----- | ---- | ----- |
| **Number of questions** | 2321 | 300 | 259 | 2880 |
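As a sanity check, the per-split counts in the table add up to the stated total:

```python
# Split sizes taken from the table above
splits = {"train": 2321, "valid": 300, "test": 259}
total = sum(splits.values())
print(total)  # 2880
```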
Source: https://github.com/DRSY/DGen |
gunshigupta/traffic_world_demos | ---
license: creativeml-openrail-m
---
|
chargoddard/Open-Platypus-Chat | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 30710879
num_examples: 24887
download_size: 15122740
dataset_size: 30710879
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
task_categories:
- question-answering
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for "Open-Platypus-Chat"
This is the [Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) dataset converted to sharegpt format, with a handful of potential refusals removed. All credit to the OpenPlatypus team and the original authors of the various component datasets. |
Depie/Gastronomos_greek_recipes | ---
license: apache-2.0
---
|
OKR/OKR33 | ---
license: openrail
---
|
den2nova/den2SD | ---
license: cc0-1.0
---
Data released for LoRA dataset disclosure: illustrations generated with a custom merged model that I regularly use.<br>
197 images, female-character illustrations only.<br><br>
This dataset is published to ensure transparency of the model merge, but the included image data and the text files listing their tags are free to use.<br>
However, please do not use them for criminal activity or for causing trouble to others.<br><br>
Captions are raw wd14-tagger output and have not been reviewed.
### The LoRA itself is also available for download (trained on SDHKv3.0)
aarda/all-data | ---
license: apache-2.0
---
|
seongs/maymust | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 153384
num_examples: 450
download_size: 57270
dataset_size: 153384
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_come_future | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 132
num_examples: 1
- name: test
num_bytes: 1448
num_examples: 5
- name: train
num_bytes: 2911
num_examples: 12
download_size: 9908
dataset_size: 4491
---
# Dataset Card for "MULTI_VALUE_wnli_come_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/NORTS | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: AssignmentId
dtype: string
- name: docId
dtype: string
- name: article
dtype: string
- name: tid1
dtype: int64
- name: tid2
dtype: int64
- name: words1
dtype: string
- name: words2
dtype: string
- name: phrases1
dtype: string
- name: phrases2
dtype: string
- name: sentences1
dtype: string
- name: sentences2
dtype: string
- name: summary1
dtype: string
- name: summary2
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 11384802
num_examples: 2400
- name: test
num_bytes: 2979312
num_examples: 600
download_size: 7539242
dataset_size: 14364114
---
# NORTS - Norwegian Topic-based Summarization Dataset
Translated from NEWTS (the NEWs Topic-based Summarization dataset, https://github.com/ali-bahrainian/NEWTS) using the 1.3B NLLB model (https://huggingface.co/facebook/nllb-200-distilled-1.3B)
|
guigux/astro_qa_fr_0.1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1303701
num_examples: 4906
download_size: 692034
dataset_size: 1303701
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
language:
- fr
tags:
- astrophysics
pretty_name: Astrophysics french QA
size_categories:
- 1K<n<10K
---
# Astrophysics french QA
The "Astrophysics french QA" dataset combines articles scraped from the web with ChatGPT-generated question-and-answer pairs, offering a blend of reference information and interactive learning in the field of astrophysics. It contains almost 5k prompt/response pairs generated by ChatGPT, and can be used to train, fine-tune, or evaluate LLMs on astrophysics topics.
|
Rayhan588/AliRaHan | ---
license: apache-2.0
---
|
DBQ/Fendi.Product.prices.Italy | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Italy - Fendi - Product-level price list
tags:
- webscraping
- ecommerce
- Fendi
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 806401
num_examples: 1995
download_size: 253225
dataset_size: 806401
---
# Fendi web scraped data
## About the website
Operating within the **luxury fashion industry**, **Fendi** is a notable player in the EMEA region, particularly in its home country, **Italy**. Known for its innovative Italian craftsmanship, it represents an essential pillar of the **Italian luxury goods sector**. The industry is characterized by artisanal production, prestigious brands, premium prices, and a global customer base focusing on exclusivity and quality. A significant trend in this industry is the shift towards **digital platforms** and **Ecommerce** to meet evolving consumer behaviours. Considering this context, the dataset reflects **Ecommerce product-list page (PLP) data on Fendi** in Italy, unveiling consumer preferences and trends in Italian luxury shopping.
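Given the schema above (`full_price`, `price`, `flg_discount`), a per-row discount percentage can be derived; the row values here are made up for illustration:

```python
def discount_pct(full_price: float, price: float) -> float:
    """Percentage discount implied by the listed price vs. the full price."""
    if full_price <= 0:
        return 0.0
    return round(100.0 * (full_price - price) / full_price, 1)

# Hypothetical row, not taken from the dataset
row = {"full_price": 1200.0, "price": 900.0, "flg_discount": 1}
print(discount_pct(row["full_price"], row["price"]))  # 25.0
```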
## Link to **dataset**
[Italy - Fendi - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Fendi%20Product-prices%20Italy/r/recGB0fxVukeTKxdK)
|
AdapterOcean/med_alpaca_standardized_cluster_9 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 70212749
num_examples: 8302
download_size: 17761402
dataset_size: 70212749
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Deivid457/Mc-Jhony | ---
license: openrail
---
|
Someman/nepali-flag | ---
license: mit
---
|
TheGreatRambler/mm2_ninji_level | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- other
- object-detection
- text-retrieval
- token-classification
- text-generation
task_ids: []
pretty_name: Mario Maker 2 ninji levels
tags:
- text-mining
---
# Mario Maker 2 ninji levels
Part of the [Mario Maker 2 Dataset Collection](https://tgrcode.com/posts/mario_maker_2_datasets)
## Dataset Description
The Mario Maker 2 ninji levels dataset consists of 21 ninji levels from Nintendo's online service and accompanies `TheGreatRambler/mm2_ninji`. The dataset was created using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api) over the course of 1 month in February 2022.
### How to use it
You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("TheGreatRambler/mm2_ninji_level", split="train")
print(next(iter(ds)))
#OUTPUT:
{
'data_id': 12171034,
'name': 'Rolling Snowballs',
'description': 'Make your way through the snowfields, and keep an eye\nout for Spikes and Snow Pokeys! Stomping on Snow Pokeys\nwill turn them into small snowballs, which you can pick up\nand throw. Play this course as many times as you want,\nand see if you can find the fastest way to the finish!',
'uploaded': 1575532800,
'ended': 1576137600,
'gamestyle': 3,
'theme': 6,
'medal_time': 26800,
'clear_condition': 0,
'clear_condition_magnitude': 0,
'unk3_0': 1309513,
'unk3_1': 62629737,
'unk3_2': 4355893,
'unk5': 1,
'unk6': 0,
'unk9': 0,
'level_data': [some binary data]
}
```
Each row is a ninji level denoted by `data_id`; `TheGreatRambler/mm2_ninji` refers to these levels. `level_data` uses the same format as `TheGreatRambler/mm2_level`, and the provided Kaitai struct file and `level.py` can be used to decode it:
```python
from datasets import load_dataset
from kaitaistruct import KaitaiStream
from io import BytesIO
from level import Level
import zlib
ds = load_dataset("TheGreatRambler/mm2_ninji_level", split="train")
level_data = next(iter(ds))["level_data"]
level = Level(KaitaiStream(BytesIO(zlib.decompress(level_data))))
# NOTE level.overworld.objects is a fixed size (limitation of Kaitai struct)
# must iterate by object_count or null objects will be included
for i in range(level.overworld.object_count):
obj = level.overworld.objects[i]
print("X: %d Y: %d ID: %s" % (obj.x, obj.y, obj.id))
#OUTPUT:
X: 1200 Y: 400 ID: ObjId.block
X: 1360 Y: 400 ID: ObjId.block
X: 1360 Y: 240 ID: ObjId.block
X: 1520 Y: 240 ID: ObjId.block
X: 1680 Y: 240 ID: ObjId.block
X: 1680 Y: 400 ID: ObjId.block
X: 1840 Y: 400 ID: ObjId.block
X: 2000 Y: 400 ID: ObjId.block
X: 2160 Y: 400 ID: ObjId.block
X: 2320 Y: 400 ID: ObjId.block
X: 2480 Y: 560 ID: ObjId.block
X: 2480 Y: 720 ID: ObjId.block
X: 2480 Y: 880 ID: ObjId.block
X: 2160 Y: 880 ID: ObjId.block
```
## Data Structure
### Data Instances
```python
{
'data_id': 12171034,
'name': 'Rolling Snowballs',
'description': 'Make your way through the snowfields, and keep an eye\nout for Spikes and Snow Pokeys! Stomping on Snow Pokeys\nwill turn them into small snowballs, which you can pick up\nand throw. Play this course as many times as you want,\nand see if you can find the fastest way to the finish!',
'uploaded': 1575532800,
'ended': 1576137600,
'gamestyle': 3,
'theme': 6,
'medal_time': 26800,
'clear_condition': 0,
'clear_condition_magnitude': 0,
'unk3_0': 1309513,
'unk3_1': 62629737,
'unk3_2': 4355893,
'unk5': 1,
'unk6': 0,
'unk9': 0,
'level_data': [some binary data]
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|data_id|int|The data ID of this ninji level|
|name|string|Name|
|description|string|Description|
|uploaded|int|UTC timestamp of when this was uploaded|
|ended|int|UTC timestamp of when this event ended|
|gamestyle|int|Gamestyle, enum below|
|theme|int|Theme, enum below|
|medal_time|int|Time to get a medal in milliseconds|
|clear_condition|int|Clear condition, enum below|
|clear_condition_magnitude|int|If applicable, the magnitude of the clear condition|
|unk3_0|int|Unknown|
|unk3_1|int|Unknown|
|unk3_2|int|Unknown|
|unk5|int|Unknown|
|unk6|int|Unknown|
|unk9|int|Unknown|
|level_data|bytes|The GZIP compressed decrypted level data, a kaitai struct file is provided to read this|
|one_screen_thumbnail|bytes|The one screen course thumbnail, as a JPEG binary|
|one_screen_thumbnail_url|string|The old URL of this thumbnail|
|one_screen_thumbnail_size|int|The filesize of this thumbnail|
|one_screen_thumbnail_filename|string|The filename of this thumbnail|
|entire_thumbnail|bytes|The entire course thumbnail, as a JPEG binary|
|entire_thumbnail_url|string|The old URL of this thumbnail|
|entire_thumbnail_size|int|The filesize of this thumbnail|
|entire_thumbnail_filename|string|The filename of this thumbnail|
### Data Splits
The dataset only contains a train split.
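Putting the sample row together with the gamestyle and theme enums from the next section (the dicts are reproduced here so the snippet is self-contained; timestamps are interpreted as UTC):

```python
from datetime import datetime, timezone

# Enum dicts reproduced from the Enums section of this card
GAMESTYLES = {0: "SMB1", 1: "SMB3", 2: "SMW", 3: "NSMBU", 4: "SM3DW"}
THEMES = {0: "Overworld", 1: "Underground", 2: "Castle", 3: "Airship",
          4: "Underwater", 5: "Ghost house", 6: "Snow", 7: "Desert",
          8: "Sky", 9: "Forest"}

# Metadata fields from the sample row above
row = {"gamestyle": 3, "theme": 6, "uploaded": 1575532800,
       "ended": 1576137600, "medal_time": 26800}

print(GAMESTYLES[row["gamestyle"]])   # NSMBU
print(THEMES[row["theme"]])           # Snow
print(row["medal_time"] / 1000)       # 26.8 (medal time in seconds)
start = datetime.fromtimestamp(row["uploaded"], tz=timezone.utc)
end = datetime.fromtimestamp(row["ended"], tz=timezone.utc)
print((end - start).days)             # 7 -- the event ran for one week
```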
## Enums
The dataset contains some enum integer fields. They match those used by `TheGreatRambler/mm2_level` for the most part, but they are reproduced below:
```python
GameStyles = {
0: "SMB1",
1: "SMB3",
2: "SMW",
3: "NSMBU",
4: "SM3DW"
}
CourseThemes = {
0: "Overworld",
1: "Underground",
2: "Castle",
3: "Airship",
4: "Underwater",
5: "Ghost house",
6: "Snow",
7: "Desert",
8: "Sky",
9: "Forest"
}
ClearConditions = {
137525990: "Reach the goal without landing after leaving the ground.",
199585683: "Reach the goal after defeating at least/all (n) Mechakoopa(s).",
272349836: "Reach the goal after defeating at least/all (n) Cheep Cheep(s).",
375673178: "Reach the goal without taking damage.",
426197923: "Reach the goal as Boomerang Mario.",
436833616: "Reach the goal while wearing a Shoe.",
713979835: "Reach the goal as Fire Mario.",
744927294: "Reach the goal as Frog Mario.",
751004331: "Reach the goal after defeating at least/all (n) Larry(s).",
900050759: "Reach the goal as Raccoon Mario.",
947659466: "Reach the goal after defeating at least/all (n) Blooper(s).",
976173462: "Reach the goal as Propeller Mario.",
994686866: "Reach the goal while wearing a Propeller Box.",
998904081: "Reach the goal after defeating at least/all (n) Spike(s).",
1008094897: "Reach the goal after defeating at least/all (n) Boom Boom(s).",
1051433633: "Reach the goal while holding a Koopa Shell.",
1061233896: "Reach the goal after defeating at least/all (n) Porcupuffer(s).",
1062253843: "Reach the goal after defeating at least/all (n) Charvaargh(s).",
1079889509: "Reach the goal after defeating at least/all (n) Bullet Bill(s).",
1080535886: "Reach the goal after defeating at least/all (n) Bully/Bullies.",
1151250770: "Reach the goal while wearing a Goomba Mask.",
1182464856: "Reach the goal after defeating at least/all (n) Hop-Chops.",
1219761531: "Reach the goal while holding a Red POW Block. OR Reach the goal after activating at least/all (n) Red POW Block(s).",
1221661152: "Reach the goal after defeating at least/all (n) Bob-omb(s).",
1259427138: "Reach the goal after defeating at least/all (n) Spiny/Spinies.",
1268255615: "Reach the goal after defeating at least/all (n) Bowser(s)/Meowser(s).",
1279580818: "Reach the goal after defeating at least/all (n) Ant Trooper(s).",
1283945123: "Reach the goal on a Lakitu's Cloud.",
1344044032: "Reach the goal after defeating at least/all (n) Boo(s).",
1425973877: "Reach the goal after defeating at least/all (n) Roy(s).",
1429902736: "Reach the goal while holding a Trampoline.",
1431944825: "Reach the goal after defeating at least/all (n) Morton(s).",
1446467058: "Reach the goal after defeating at least/all (n) Fish Bone(s).",
1510495760: "Reach the goal after defeating at least/all (n) Monty Mole(s).",
1656179347: "Reach the goal after picking up at least/all (n) 1-Up Mushroom(s).",
1665820273: "Reach the goal after defeating at least/all (n) Hammer Bro(s.).",
1676924210: "Reach the goal after hitting at least/all (n) P Switch(es). OR Reach the goal while holding a P Switch.",
1715960804: "Reach the goal after activating at least/all (n) POW Block(s). OR Reach the goal while holding a POW Block.",
1724036958: "Reach the goal after defeating at least/all (n) Angry Sun(s).",
1730095541: "Reach the goal after defeating at least/all (n) Pokey(s).",
1780278293: "Reach the goal as Superball Mario.",
1839897151: "Reach the goal after defeating at least/all (n) Pom Pom(s).",
1969299694: "Reach the goal after defeating at least/all (n) Peepa(s).",
2035052211: "Reach the goal after defeating at least/all (n) Lakitu(s).",
2038503215: "Reach the goal after defeating at least/all (n) Lemmy(s).",
2048033177: "Reach the goal after defeating at least/all (n) Lava Bubble(s).",
2076496776: "Reach the goal while wearing a Bullet Bill Mask.",
2089161429: "Reach the goal as Big Mario.",
2111528319: "Reach the goal as Cat Mario.",
2131209407: "Reach the goal after defeating at least/all (n) Goomba(s)/Galoomba(s).",
2139645066: "Reach the goal after defeating at least/all (n) Thwomp(s).",
2259346429: "Reach the goal after defeating at least/all (n) Iggy(s).",
2549654281: "Reach the goal while wearing a Dry Bones Shell.",
2694559007: "Reach the goal after defeating at least/all (n) Sledge Bro(s.).",
2746139466: "Reach the goal after defeating at least/all (n) Rocky Wrench(es).",
2749601092: "Reach the goal after grabbing at least/all (n) 50-Coin(s).",
2855236681: "Reach the goal as Flying Squirrel Mario.",
3036298571: "Reach the goal as Buzzy Mario.",
3074433106: "Reach the goal as Builder Mario.",
3146932243: "Reach the goal as Cape Mario.",
3174413484: "Reach the goal after defeating at least/all (n) Wendy(s).",
3206222275: "Reach the goal while wearing a Cannon Box.",
3314955857: "Reach the goal as Link.",
3342591980: "Reach the goal while you have Super Star invincibility.",
3346433512: "Reach the goal after defeating at least/all (n) Goombrat(s)/Goombud(s).",
3348058176: "Reach the goal after grabbing at least/all (n) 10-Coin(s).",
3353006607: "Reach the goal after defeating at least/all (n) Buzzy Beetle(s).",
3392229961: "Reach the goal after defeating at least/all (n) Bowser Jr.(s).",
3437308486: "Reach the goal after defeating at least/all (n) Koopa Troopa(s).",
3459144213: "Reach the goal after defeating at least/all (n) Chain Chomp(s).",
3466227835: "Reach the goal after defeating at least/all (n) Muncher(s).",
3481362698: "Reach the goal after defeating at least/all (n) Wiggler(s).",
3513732174: "Reach the goal as SMB2 Mario.",
3649647177: "Reach the goal in a Koopa Clown Car/Junior Clown Car.",
3725246406: "Reach the goal as Spiny Mario.",
3730243509: "Reach the goal in a Koopa Troopa Car.",
3748075486: "Reach the goal after defeating at least/all (n) Piranha Plant(s)/Jumping Piranha Plant(s).",
3797704544: "Reach the goal after defeating at least/all (n) Dry Bones.",
3824561269: "Reach the goal after defeating at least/all (n) Stingby/Stingbies.",
3833342952: "Reach the goal after defeating at least/all (n) Piranha Creeper(s).",
3842179831: "Reach the goal after defeating at least/all (n) Fire Piranha Plant(s).",
3874680510: "Reach the goal after breaking at least/all (n) Crates(s).",
3974581191: "Reach the goal after defeating at least/all (n) Ludwig(s).",
3977257962: "Reach the goal as Super Mario.",
4042480826: "Reach the goal after defeating at least/all (n) Skipsqueak(s).",
4116396131: "Reach the goal after grabbing at least/all (n) Coin(s).",
4117878280: "Reach the goal after defeating at least/all (n) Magikoopa(s).",
4122555074: "Reach the goal after grabbing at least/all (n) 30-Coin(s).",
4153835197: "Reach the goal as Balloon Mario.",
4172105156: "Reach the goal while wearing a Red POW Box.",
4209535561: "Reach the Goal while riding Yoshi.",
4269094462: "Reach the goal after defeating at least/all (n) Spike Top(s).",
4293354249: "Reach the goal after defeating at least/all (n) Banzai Bill(s)."
}
```
<!-- TODO create detailed statistics -->
## Dataset Creation
The dataset was created over a little more than a month in February 2022 using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api). Because requests made to Nintendo's servers require authentication, the process had to be done with the utmost care, limiting download speed so as not to overload the API and risk a ban. There are no plans to release an updated version of this dataset.
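The download-speed limiting described above can be sketched as a minimal client-side rate limiter. This is an illustrative sketch only, not the actual scraping code used for this dataset; the `RateLimiter` class and the chosen interval are hypothetical.

```python
import time


class RateLimiter:
    """Enforce a minimum delay between consecutive API requests."""

    def __init__(self, min_interval_s: float):
        self.min_interval_s = min_interval_s
        self._last_call = 0.0  # monotonic timestamp of the previous request

    def wait(self) -> None:
        # Sleep just long enough that requests are at least
        # `min_interval_s` seconds apart, then record this call.
        elapsed = time.monotonic() - self._last_call
        if elapsed < self.min_interval_s:
            time.sleep(self.min_interval_s - elapsed)
        self._last_call = time.monotonic()


# Throttle a batch of (hypothetical) level-metadata requests.
limiter = RateLimiter(min_interval_s=0.05)
start = time.monotonic()
for _ in range(3):
    limiter.wait()
    # here one would call e.g. requests.get(...) against the API
elapsed = time.monotonic() - start
```

In a real scraper the `wait()` call would precede each HTTP request, keeping the request rate bounded regardless of how fast responses come back.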
## Considerations for Using the Data
As these 21 levels were made and vetted by Nintendo, the dataset contains no harmful language or depictions.
|
ura-hcmut/vmlu_vi | ---
language:
- vi
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: test
path: test.jsonl
- split: valid
path: valid.jsonl
- split: dev
path: dev.jsonl
--- |
japanese-asr/whisper_transcriptions.reazonspeech.all_31 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30558823060.0
num_examples: 267996
download_size: 30317834028
dataset_size: 30558823060.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
jazzysnake01/oasst-1-hun-openai | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 8053160
num_examples: 5312
download_size: 3874931
dataset_size: 8053160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mHossain/final_train_v4_test_40000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 5756432.4
num_examples: 18000
- name: test
num_bytes: 639603.6
num_examples: 2000
download_size: 2778012
dataset_size: 6396036.0
---
# Dataset Card for "final_train_v4_test_40000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pampkinus/Volodymyr-Zelenskyj | ---
license: openrail
---
Faceset of the current president of Ukraine: 8480 aligned pictures (JPG) of his face from the latest UN meeting.
https://cs.wikipedia.org/wiki/Volodymyr_Zelenskyj |
kgr123/quality_counter_3500 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 19330578
num_examples: 1929
- name: train
num_bytes: 19139136
num_examples: 1935
- name: validation
num_bytes: 19531896
num_examples: 1941
download_size: 12876045
dataset_size: 58001610
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
YaNWoni/GARAM | ---
license: openrail
---
|