| datasetId | card |
|---|---|
joey234/mmlu-elementary_mathematics-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 3037
num_examples: 5
download_size: 0
dataset_size: 3037
---
# Dataset Card for "mmlu-elementary_mathematics-dev"
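The `answer` feature above is a `class_label`: integers 0–3 that decode to the letters A–D. A minimal stand-alone sketch of that mapping (mirroring, but not importing, `datasets.ClassLabel`):

```python
# The `answer` column stores integers 0-3 that map to A-D per the
# `names` mapping in the YAML above.
ANSWER_NAMES = ["A", "B", "C", "D"]

def int2str(label: int) -> str:
    """Decode an integer class label to its letter (mirrors ClassLabel.int2str)."""
    return ANSWER_NAMES[label]

def str2int(name: str) -> int:
    """Encode a letter back to its integer label (mirrors ClassLabel.str2int)."""
    return ANSWER_NAMES.index(name)

print(int2str(2))    # -> C
print(str2int("D"))  # -> 3
```

With the `datasets` library loaded, the equivalent calls would be `dataset.features["answer"].int2str(...)` and `.str2int(...)`.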
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_validation_google_flan_t5_xl_mode_C_A_T_OCR_rices_ns_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 596556
num_examples: 500
download_size: 111787
dataset_size: 596556
---
# Dataset Card for "Hatefulmemes_validation_google_flan_t5_xl_mode_C_A_T_OCR_rices_ns_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/data-standardized_cluster_11_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9401643
num_examples: 3901
download_size: 3993365
dataset_size: 9401643
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_11_alpaca"
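The card above stores each example as a plain `input`/`output` string pair. A minimal sketch of rendering such a pair into a single training string, assuming the common Alpaca prompt template (the template wording is an assumption, not part of this card):

```python
# Hypothetical Alpaca-style template; the card itself does not specify one.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{input}\n\n### Response:\n{output}"
)

def to_alpaca_prompt(example: dict) -> str:
    """Render one input/output row into a single training string."""
    return ALPACA_TEMPLATE.format(**example)

row = {"input": "Name a primary color.", "output": "Red."}
print(to_alpaca_prompt(row))
```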
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nampdn-ai/mini-vncc-envi | ---
license: apache-2.0
---
|
Utkarsh736/IPL_MALE | ---
license: mit
---
|
allegro/klej-cdsc-e | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- pl
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
pretty_name: 'CDSC-E'
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- natural-language-inference
---
# klej-cdsc-e
## Description
Polish CDSCorpus consists of 10K Polish sentence pairs which are human-annotated for semantic relatedness (**CDSC-R**) and entailment (**CDSC-E**). The dataset may be used to evaluate compositional distributional semantics models of Polish. The dataset was presented at ACL 2017.
Although the main design of the dataset is inspired by the SICK corpus, it differs in detail. As in SICK, the sentences come from image captions, but the set of chosen images is much more diverse, as they come from 46 thematic groups.
## Tasks (input, output, and metrics)
The entailment relation between two sentences is labeled with *entailment*, *contradiction*, or *neutral*. The task is to predict if the premise entails the hypothesis (entailment), negates the hypothesis (contradiction), or is unrelated (neutral).
b **entails** a (a **wynika z** b) – if a situation or an event described by sentence b occurs, it is recognized that a situation or an event described by a occurs as well, i.e., a and b refer to the same event or the same situation.
**Input**: ('sentence_A', 'sentence_B'): sentence pair
**Output**: ('entailment_judgment' column): one of the possible entailment relations (*entailment*, *contradiction*, *neutral*)
**Domain:** image captions
**Measurements**: Accuracy
**Example:**
Input: `Żaden mężczyzna nie stoi na przystanku autobusowym.` ; `Mężczyzna z żółtą i białą reklamówką w ręce stoi na przystanku obok autobusu.`
Input (translated by DeepL): `No man standing at the bus stop.` ; `A man with a yellow and white bag in his hand stands at a bus stop next to a bus.`
Output: `entailment`
## Data splits
| Subset | Cardinality |
| ------------- | ----------: |
| train | 8000 |
| validation | 1000 |
| test | 1000 |
## Class distribution
| Class | train | validation | test |
|:--------------|--------:|-------------:|-------:|
| NEUTRAL | 0.744 | 0.741 | 0.744 |
| ENTAILMENT | 0.179 | 0.185 | 0.190 |
| CONTRADICTION | 0.077 | 0.074 | 0.066 |
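The proportions above can be recomputed from a split's labels with the standard library alone. The toy label list below mirrors the published train-split ratios and is illustrative only, not the actual data:

```python
from collections import Counter

def class_distribution(labels):
    """Return per-class proportions, rounded to 3 decimals as in the table above."""
    counts = Counter(labels)
    total = len(labels)
    return {cls: round(n / total, 3) for cls, n in counts.items()}

# Toy labels standing in for dataset["train"]["entailment_judgment"]
toy = ["NEUTRAL"] * 744 + ["ENTAILMENT"] * 179 + ["CONTRADICTION"] * 77
print(class_distribution(toy))
# {'NEUTRAL': 0.744, 'ENTAILMENT': 0.179, 'CONTRADICTION': 0.077}
```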
## Citation
```
@inproceedings{wroblewska-krasnowska-kieras-2017-polish,
title = "{P}olish evaluation dataset for compositional distributional semantics models",
author = "Wr{\'o}blewska, Alina and
Krasnowska-Kiera{\'s}, Katarzyna",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P17-1073",
doi = "10.18653/v1/P17-1073",
pages = "784--792",
    abstract = "The paper presents a procedure of building an evaluation dataset for the validation of compositional distributional semantics models estimated for languages other than English. The procedure generally builds on steps designed to assemble the SICK corpus, which contains pairs of English sentences annotated for semantic relatedness and entailment, because we aim at building a comparable dataset. However, the implementation of particular building steps significantly differs from the original SICK design assumptions, which is caused by both lack of necessary extraneous resources for an investigated language and the need for language-specific transformation rules. The designed procedure is verified on Polish, a fusional language with a relatively free word order, and contributes to building a Polish evaluation dataset. The resource consists of 10K sentence pairs which are human-annotated for semantic relatedness and entailment. The dataset may be used for the evaluation of compositional distributional semantics models of Polish.",
}
```
## License
```
Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)
```
## Links
[HuggingFace](https://huggingface.co/datasets/allegro/klej-cdsc-e)
[Source](http://zil.ipipan.waw.pl/Scwad/CDSCorpus)
[Paper](https://aclanthology.org/P17-1073.pdf)
## Examples
### Loading
```python
from pprint import pprint
from datasets import load_dataset
dataset = load_dataset("allegro/klej-cdsc-e")
pprint(dataset["train"][0])
# {'entailment_judgment': 'NEUTRAL',
# 'pair_ID': 1,
# 'sentence_A': 'Chłopiec w czerwonych trampkach skacze wysoko do góry '
# 'nieopodal fontanny .',
# 'sentence_B': 'Chłopiec w bluzce w paski podskakuje wysoko obok brązowej '
# 'fontanny .'}
```
### Evaluation
```python
import random
from pprint import pprint
from datasets import load_dataset, load_metric
dataset = load_dataset("allegro/klej-cdsc-e")
dataset = dataset.class_encode_column("entailment_judgment")
references = dataset["test"]["entailment_judgment"]
# generate random predictions
predictions = [random.randrange(max(references) + 1) for _ in range(len(references))]
acc = load_metric("accuracy")
f1 = load_metric("f1")
acc_score = acc.compute(predictions=predictions, references=references)
f1_score = f1.compute(predictions=predictions, references=references, average="macro")
pprint(acc_score)
pprint(f1_score)
# {'accuracy': 0.325}
# {'f1': 0.2736171695141161}
``` |
irds/gov_trec-web-2002_named-page | ---
pretty_name: '`gov/trec-web-2002/named-page`'
viewer: false
source_datasets: ['irds/gov']
task_categories:
- text-retrieval
---
# Dataset Card for `gov/trec-web-2002/named-page`
The `gov/trec-web-2002/named-page` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/gov#gov/trec-web-2002/named-page).
# Data
This dataset provides:
- `queries` (i.e., topics); count=150
- `qrels` (relevance assessments); count=170
- For `docs`, use [`irds/gov`](https://huggingface.co/datasets/irds/gov)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/gov_trec-web-2002_named-page', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/gov_trec-web-2002_named-page', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
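Since each `qrels` record arrives as a flat `{query_id, doc_id, relevance, iteration}` row, an index keyed by query is often handy for evaluation. A minimal sketch (the toy records below are illustrative placeholders, not real GOV judgments):

```python
from collections import defaultdict

def build_qrels_index(qrels_records):
    """Group relevance judgments by query: {query_id: {doc_id: relevance}}."""
    index = defaultdict(dict)
    for rec in qrels_records:
        index[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(index)

# Toy records with the same fields as the `qrels` config above
toy_qrels = [
    {"query_id": "1", "doc_id": "G00-01-000000", "relevance": 1, "iteration": "0"},
    {"query_id": "1", "doc_id": "G00-01-000001", "relevance": 0, "iteration": "0"},
]
print(build_qrels_index(toy_qrels))
```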
## Citation Information
```
@inproceedings{Craswell2002TrecWeb,
title={Overview of the TREC-2002 Web Track},
author={Nick Craswell and David Hawking},
booktitle={TREC},
year={2002}
}
```
|
Seanxh/twitter_dataset_1713097662 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25699
num_examples: 61
download_size: 14814
dataset_size: 25699
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_NLUHOPOE__experiment2-cause | ---
pretty_name: Evaluation run of NLUHOPOE/experiment2-cause
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/experiment2-cause](https://huggingface.co/NLUHOPOE/experiment2-cause)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__experiment2-cause\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T01:23:01.648750](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__experiment2-cause/blob/main/results_2024-03-02T01-23-01.648750.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6187092967805649,\n\
\ \"acc_stderr\": 0.03277112039995135,\n \"acc_norm\": 0.6247066510159702,\n\
\ \"acc_norm_stderr\": 0.03344268306035278,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.4713263218122602,\n\
\ \"mc2_stderr\": 0.01458246045981096\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627077,\n\
\ \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.014291228393536588\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6210914160525791,\n\
\ \"acc_stderr\": 0.004841238763529372,\n \"acc_norm\": 0.8276239792869946,\n\
\ \"acc_norm_stderr\": 0.003769350079195885\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915333,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915333\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415925,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415925\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.0246624968452098,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.0246624968452098\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.016185444179457175,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.016185444179457175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630457,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567654,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567654\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.4713263218122602,\n\
\ \"mc2_stderr\": 0.01458246045981096\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3502653525398029,\n \
\ \"acc_stderr\": 0.013140409455571269\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/experiment2-cause
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|arc:challenge|25_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|arc:challenge|25_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|gsm8k|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|gsm8k|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hellaswag|10_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hellaswag|10_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-40-54.302195.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-23-01.648750.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T01-23-01.648750.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- '**/details_harness|winogrande|5_2024-03-01T13-40-54.302195.parquet'
- split: 2024_03_02T01_23_01.648750
path:
- '**/details_harness|winogrande|5_2024-03-02T01-23-01.648750.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T01-23-01.648750.parquet'
- config_name: results
data_files:
- split: 2024_03_01T13_40_54.302195
path:
- results_2024-03-01T13-40-54.302195.parquet
- split: 2024_03_02T01_23_01.648750
path:
- results_2024-03-02T01-23-01.648750.parquet
- split: latest
path:
- results_2024-03-02T01-23-01.648750.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/experiment2-cause
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/experiment2-cause](https://huggingface.co/NLUHOPOE/experiment2-cause) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__experiment2-cause",
"harness_winogrande_5",
split="train")
```
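Configuration names follow a regular pattern of the form `harness_<task>_<n_shot>`, with `-` and `:` in the original task names replaced by `_`. A small helper (a hypothetical convenience function, not part of the `datasets` library) can build the configuration name for a given task:

```python
def config_name(task: str, n_shot: int) -> str:
    # Normalize the task name the way the config names above do:
    # "-" and ":" both become "_", e.g. "truthfulqa:mc" -> "truthfulqa_mc".
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{n_shot}"

print(config_name("hendrycksTest-world_religions", 5))  # harness_hendrycksTest_world_religions_5
print(config_name("truthfulqa:mc", 0))                  # harness_truthfulqa_mc_0
```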
## Latest results
These are the [latest results from run 2024-03-02T01:23:01.648750](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__experiment2-cause/blob/main/results_2024-03-02T01-23-01.648750.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6187092967805649,
"acc_stderr": 0.03277112039995135,
"acc_norm": 0.6247066510159702,
"acc_norm_stderr": 0.03344268306035278,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.4713263218122602,
"mc2_stderr": 0.01458246045981096
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627077,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.014291228393536588
},
"harness|hellaswag|10": {
"acc": 0.6210914160525791,
"acc_stderr": 0.004841238763529372,
"acc_norm": 0.8276239792869946,
"acc_norm_stderr": 0.003769350079195885
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915333,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915333
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415925,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415925
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.0246624968452098,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.0246624968452098
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.016185444179457175,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.016185444179457175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630457,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567654,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.4713263218122602,
"mc2_stderr": 0.01458246045981096
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.3502653525398029,
"acc_stderr": 0.013140409455571269
}
}
```
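Since each results file is plain JSON with one metric dict per task, it can be post-processed directly. The sketch below works on a small excerpt of the values shown above and picks the highest-scoring task, preferring normalized accuracy where it is reported:

```python
# Excerpt of the per-task metrics shown above (same structure as the full JSON).
results = {
    "harness|arc:challenge|25": {"acc": 0.5554607508532423, "acc_norm": 0.6040955631399317},
    "harness|hellaswag|10": {"acc": 0.6210914160525791, "acc_norm": 0.8276239792869946},
    "harness|winogrande|5": {"acc": 0.7884767166535123},  # no acc_norm reported
}

# Prefer normalized accuracy where present, otherwise fall back to raw accuracy.
scores = {task: m.get("acc_norm", m.get("acc")) for task, m in results.items()}
best_task = max(scores, key=scores.get)
print(best_task)  # harness|hellaswag|10
```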
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
neeeeellllll/Financial_Sentiment_Analyis_dataset | ---
license: unknown
---
|
jyhasder34/demang | ---
license: bigscience-bloom-rail-1.0
---
|
open-llm-leaderboard/details_AA051610__O0128 | ---
pretty_name: Evaluation run of AA051610/O0128
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/O0128](https://huggingface.co/AA051610/O0128) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__O0128\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T17:02:36.892419](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__O0128/blob/main/results_2024-01-28T17-02-36.892419.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8273006081081993,\n\
\ \"acc_stderr\": 0.024663470781539607,\n \"acc_norm\": 0.8335369148949214,\n\
\ \"acc_norm_stderr\": 0.025075862506569718,\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6012752268438828,\n\
\ \"mc2_stderr\": 0.014979362035595621\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946528\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6569408484365664,\n\
\ \"acc_stderr\": 0.004737608340163403,\n \"acc_norm\": 0.853415654252141,\n\
\ \"acc_norm_stderr\": 0.003529682285857263\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8222222222222222,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.8222222222222222,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752271,\n\
\ \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752271\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n\
\ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8641509433962264,\n \"acc_stderr\": 0.02108730862243985,\n\
\ \"acc_norm\": 0.8641509433962264,\n \"acc_norm_stderr\": 0.02108730862243985\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9722222222222222,\n\
\ \"acc_stderr\": 0.013742429025504266,\n \"acc_norm\": 0.9722222222222222,\n\
\ \"acc_norm_stderr\": 0.013742429025504266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8265895953757225,\n\
\ \"acc_stderr\": 0.02886810787497064,\n \"acc_norm\": 0.8265895953757225,\n\
\ \"acc_norm_stderr\": 0.02886810787497064\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n\
\ \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8468085106382979,\n \"acc_stderr\": 0.023545179061675203,\n\
\ \"acc_norm\": 0.8468085106382979,\n \"acc_norm_stderr\": 0.023545179061675203\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7105263157894737,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.7105263157894737,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8620689655172413,\n \"acc_stderr\": 0.028735632183908084,\n\
\ \"acc_norm\": 0.8620689655172413,\n \"acc_norm_stderr\": 0.028735632183908084\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.791005291005291,\n \"acc_stderr\": 0.020940481565334863,\n \"\
acc_norm\": 0.791005291005291,\n \"acc_norm_stderr\": 0.020940481565334863\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6507936507936508,\n\
\ \"acc_stderr\": 0.04263906892795131,\n \"acc_norm\": 0.6507936507936508,\n\
\ \"acc_norm_stderr\": 0.04263906892795131\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9419354838709677,\n\
\ \"acc_stderr\": 0.01330413811280927,\n \"acc_norm\": 0.9419354838709677,\n\
\ \"acc_norm_stderr\": 0.01330413811280927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.7635467980295566,\n \"acc_stderr\": 0.029896114291733545,\n\
\ \"acc_norm\": 0.7635467980295566,\n \"acc_norm_stderr\": 0.029896114291733545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\"\
: 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9151515151515152,\n \"acc_stderr\": 0.021759385340835914,\n\
\ \"acc_norm\": 0.9151515151515152,\n \"acc_norm_stderr\": 0.021759385340835914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9696969696969697,\n \"acc_stderr\": 0.012213156893572809,\n \"\
acc_norm\": 0.9696969696969697,\n \"acc_norm_stderr\": 0.012213156893572809\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909029,\n\
\ \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909029\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8692307692307693,\n \"acc_stderr\": 0.017094072023289646,\n\
\ \"acc_norm\": 0.8692307692307693,\n \"acc_norm_stderr\": 0.017094072023289646\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.9327731092436975,\n \"acc_stderr\": 0.016266171559293868,\n\
\ \"acc_norm\": 0.9327731092436975,\n \"acc_norm_stderr\": 0.016266171559293868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.6291390728476821,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.6291390728476821,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.944954128440367,\n \"acc_stderr\": 0.009778411055200768,\n \"\
acc_norm\": 0.944954128440367,\n \"acc_norm_stderr\": 0.009778411055200768\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7407407407407407,\n \"acc_stderr\": 0.02988691054762698,\n \"\
acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02988691054762698\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9607843137254902,\n \"acc_stderr\": 0.013623692819208832,\n \"\
acc_norm\": 0.9607843137254902,\n \"acc_norm_stderr\": 0.013623692819208832\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370185,\n \
\ \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.852017937219731,\n\
\ \"acc_stderr\": 0.02383155715761354,\n \"acc_norm\": 0.852017937219731,\n\
\ \"acc_norm_stderr\": 0.02383155715761354\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n\
\ \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9338842975206612,\n \"acc_stderr\": 0.022683403691723312,\n \"\
acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.022683403691723312\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9259259259259259,\n\
\ \"acc_stderr\": 0.025317997297209727,\n \"acc_norm\": 0.9259259259259259,\n\
\ \"acc_norm_stderr\": 0.025317997297209727\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9325153374233128,\n \"acc_stderr\": 0.01970938281499787,\n\
\ \"acc_norm\": 0.9325153374233128,\n \"acc_norm_stderr\": 0.01970938281499787\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.7232142857142857,\n\
\ \"acc_stderr\": 0.04246624336697623,\n \"acc_norm\": 0.7232142857142857,\n\
\ \"acc_norm_stderr\": 0.04246624336697623\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9320388349514563,\n \"acc_stderr\": 0.024919959142514464,\n\
\ \"acc_norm\": 0.9320388349514563,\n \"acc_norm_stderr\": 0.024919959142514464\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9786324786324786,\n\
\ \"acc_stderr\": 0.009473466537245874,\n \"acc_norm\": 0.9786324786324786,\n\
\ \"acc_norm_stderr\": 0.009473466537245874\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466136,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466136\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9450830140485313,\n\
\ \"acc_stderr\": 0.008146760500752312,\n \"acc_norm\": 0.9450830140485313,\n\
\ \"acc_norm_stderr\": 0.008146760500752312\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.861271676300578,\n \"acc_stderr\": 0.018609859931640438,\n\
\ \"acc_norm\": 0.861271676300578,\n \"acc_norm_stderr\": 0.018609859931640438\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8603351955307262,\n\
\ \"acc_stderr\": 0.011593340045150927,\n \"acc_norm\": 0.8603351955307262,\n\
\ \"acc_norm_stderr\": 0.011593340045150927\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01624099518367418,\n\
\ \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01624099518367418\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.9035369774919614,\n\
\ \"acc_stderr\": 0.016767663560541785,\n \"acc_norm\": 0.9035369774919614,\n\
\ \"acc_norm_stderr\": 0.016767663560541785\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.9012345679012346,\n \"acc_stderr\": 0.01660046080164534,\n\
\ \"acc_norm\": 0.9012345679012346,\n \"acc_norm_stderr\": 0.01660046080164534\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.7375886524822695,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.7375886524822695,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.7685788787483703,\n\
\ \"acc_stderr\": 0.010771461711576476,\n \"acc_norm\": 0.7685788787483703,\n\
\ \"acc_norm_stderr\": 0.010771461711576476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.9227941176470589,\n \"acc_stderr\": 0.016214104160827764,\n\
\ \"acc_norm\": 0.9227941176470589,\n \"acc_norm_stderr\": 0.016214104160827764\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.880718954248366,\n \"acc_stderr\": 0.013112448195110083,\n \
\ \"acc_norm\": 0.880718954248366,\n \"acc_norm_stderr\": 0.013112448195110083\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8775510204081632,\n \"acc_stderr\": 0.020985477705882164,\n\
\ \"acc_norm\": 0.8775510204081632,\n \"acc_norm_stderr\": 0.020985477705882164\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9601990049751243,\n\
\ \"acc_stderr\": 0.013823327352686389,\n \"acc_norm\": 0.9601990049751243,\n\
\ \"acc_norm_stderr\": 0.013823327352686389\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759036,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759036\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6265060240963856,\n\
\ \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.6265060240963856,\n\
\ \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9298245614035088,\n \"acc_stderr\": 0.019591541754525123,\n\
\ \"acc_norm\": 0.9298245614035088,\n \"acc_norm_stderr\": 0.019591541754525123\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6012752268438828,\n\
\ \"mc2_stderr\": 0.014979362035595621\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359238\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \
\ \"acc_stderr\": 0.012799353675801834\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/O0128
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|arc:challenge|25_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|gsm8k|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hellaswag|10_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T17-02-36.892419.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- '**/details_harness|winogrande|5_2024-01-28T17-02-36.892419.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T17-02-36.892419.parquet'
- config_name: results
data_files:
- split: 2024_01_28T17_02_36.892419
path:
- results_2024-01-28T17-02-36.892419.parquet
- split: latest
path:
- results_2024-01-28T17-02-36.892419.parquet
---
# Dataset Card for Evaluation run of AA051610/O0128
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/O0128](https://huggingface.co/AA051610/O0128) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__O0128",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-28T17:02:36.892419](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__O0128/blob/main/results_2024-01-28T17-02-36.892419.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.8273006081081993,
"acc_stderr": 0.024663470781539607,
"acc_norm": 0.8335369148949214,
"acc_norm_stderr": 0.025075862506569718,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6012752268438828,
"mc2_stderr": 0.014979362035595621
},
"harness|arc:challenge|25": {
"acc": 0.6552901023890785,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946528
},
"harness|hellaswag|10": {
"acc": 0.6569408484365664,
"acc_stderr": 0.004737608340163403,
"acc_norm": 0.853415654252141,
"acc_norm_stderr": 0.003529682285857263
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8222222222222222,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.8222222222222222,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.02427022773752271,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.02427022773752271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8641509433962264,
"acc_stderr": 0.02108730862243985,
"acc_norm": 0.8641509433962264,
"acc_norm_stderr": 0.02108730862243985
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9722222222222222,
"acc_stderr": 0.013742429025504266,
"acc_norm": 0.9722222222222222,
"acc_norm_stderr": 0.013742429025504266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.02886810787497064,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.02886810787497064
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8468085106382979,
"acc_stderr": 0.023545179061675203,
"acc_norm": 0.8468085106382979,
"acc_norm_stderr": 0.023545179061675203
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.028735632183908084,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.028735632183908084
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.791005291005291,
"acc_stderr": 0.020940481565334863,
"acc_norm": 0.791005291005291,
"acc_norm_stderr": 0.020940481565334863
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6507936507936508,
"acc_stderr": 0.04263906892795131,
"acc_norm": 0.6507936507936508,
"acc_norm_stderr": 0.04263906892795131
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9419354838709677,
"acc_stderr": 0.01330413811280927,
"acc_norm": 0.9419354838709677,
"acc_norm_stderr": 0.01330413811280927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7635467980295566,
"acc_stderr": 0.029896114291733545,
"acc_norm": 0.7635467980295566,
"acc_norm_stderr": 0.029896114291733545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9151515151515152,
"acc_stderr": 0.021759385340835914,
"acc_norm": 0.9151515151515152,
"acc_norm_stderr": 0.021759385340835914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9696969696969697,
"acc_stderr": 0.012213156893572809,
"acc_norm": 0.9696969696969697,
"acc_norm_stderr": 0.012213156893572809
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909029,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909029
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8692307692307693,
"acc_stderr": 0.017094072023289646,
"acc_norm": 0.8692307692307693,
"acc_norm_stderr": 0.017094072023289646
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.9327731092436975,
"acc_stderr": 0.016266171559293868,
"acc_norm": 0.9327731092436975,
"acc_norm_stderr": 0.016266171559293868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6291390728476821,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.6291390728476821,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.944954128440367,
"acc_stderr": 0.009778411055200768,
"acc_norm": 0.944954128440367,
"acc_norm_stderr": 0.009778411055200768
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02988691054762698,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02988691054762698
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9607843137254902,
"acc_stderr": 0.013623692819208832,
"acc_norm": 0.9607843137254902,
"acc_norm_stderr": 0.013623692819208832
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9493670886075949,
"acc_stderr": 0.014271760025370185,
"acc_norm": 0.9493670886075949,
"acc_norm_stderr": 0.014271760025370185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.852017937219731,
"acc_stderr": 0.02383155715761354,
"acc_norm": 0.852017937219731,
"acc_norm_stderr": 0.02383155715761354
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9083969465648855,
"acc_stderr": 0.025300035578642962,
"acc_norm": 0.9083969465648855,
"acc_norm_stderr": 0.025300035578642962
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9338842975206612,
"acc_stderr": 0.022683403691723312,
"acc_norm": 0.9338842975206612,
"acc_norm_stderr": 0.022683403691723312
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9259259259259259,
"acc_stderr": 0.025317997297209727,
"acc_norm": 0.9259259259259259,
"acc_norm_stderr": 0.025317997297209727
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9325153374233128,
"acc_stderr": 0.01970938281499787,
"acc_norm": 0.9325153374233128,
"acc_norm_stderr": 0.01970938281499787
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.7232142857142857,
"acc_stderr": 0.04246624336697623,
"acc_norm": 0.7232142857142857,
"acc_norm_stderr": 0.04246624336697623
},
"harness|hendrycksTest-management|5": {
"acc": 0.9320388349514563,
"acc_stderr": 0.024919959142514464,
"acc_norm": 0.9320388349514563,
"acc_norm_stderr": 0.024919959142514464
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9786324786324786,
"acc_stderr": 0.009473466537245874,
"acc_norm": 0.9786324786324786,
"acc_norm_stderr": 0.009473466537245874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466136,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466136
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9450830140485313,
"acc_stderr": 0.008146760500752312,
"acc_norm": 0.9450830140485313,
"acc_norm_stderr": 0.008146760500752312
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.861271676300578,
"acc_stderr": 0.018609859931640438,
"acc_norm": 0.861271676300578,
"acc_norm_stderr": 0.018609859931640438
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8603351955307262,
"acc_stderr": 0.011593340045150927,
"acc_norm": 0.8603351955307262,
"acc_norm_stderr": 0.011593340045150927
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01624099518367418,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01624099518367418
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.9035369774919614,
"acc_stderr": 0.016767663560541785,
"acc_norm": 0.9035369774919614,
"acc_norm_stderr": 0.016767663560541785
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.9012345679012346,
"acc_stderr": 0.01660046080164534,
"acc_norm": 0.9012345679012346,
"acc_norm_stderr": 0.01660046080164534
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7375886524822695,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.7375886524822695,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.7685788787483703,
"acc_stderr": 0.010771461711576476,
"acc_norm": 0.7685788787483703,
"acc_norm_stderr": 0.010771461711576476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9227941176470589,
"acc_stderr": 0.016214104160827764,
"acc_norm": 0.9227941176470589,
"acc_norm_stderr": 0.016214104160827764
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.880718954248366,
"acc_stderr": 0.013112448195110083,
"acc_norm": 0.880718954248366,
"acc_norm_stderr": 0.013112448195110083
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8775510204081632,
"acc_stderr": 0.020985477705882164,
"acc_norm": 0.8775510204081632,
"acc_norm_stderr": 0.020985477705882164
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9601990049751243,
"acc_stderr": 0.013823327352686389,
"acc_norm": 0.9601990049751243,
"acc_norm_stderr": 0.013823327352686389
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759036,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759036
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6265060240963856,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.6265060240963856,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9298245614035088,
"acc_stderr": 0.019591541754525123,
"acc_norm": 0.9298245614035088,
"acc_norm_stderr": 0.019591541754525123
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6012752268438828,
"mc2_stderr": 0.014979362035595621
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359238
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.012799353675801834
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lhallee/PPI_vector_embeddings | ---
dataset_info:
features:
- name: seqs
dtype: string
- name: vectors
sequence: float64
splits:
- name: train
num_bytes: 3644474828
num_examples: 287677
download_size: 833597197
dataset_size: 3644474828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EmbeddingStudio/synthetic-search-queries | ---
license: apache-2.0
dataset_info:
features:
- name: Query
dtype: string
- name: category
dtype: string
- name: Parsed
sequence: string
splits:
- name: train_queries
num_bytes: 2061432
num_examples: 10700
- name: test_queries
num_bytes: 737413
num_examples: 3608
download_size: 741810
dataset_size: 2798845
configs:
- config_name: default
data_files:
- split: train_queries
path: data/train_queries-*
- split: test_queries
path: data/test_queries-*
task_categories:
- token-classification
- text-generation
language:
- en
tags:
- synthetic
- search-queries
- e-commerce
- online-shops
- travel-agencies
- educational-institutions-ai
- job-recruitment-automation
- banking-digital-services
- investment-ai-analysis
- insurance-tech-innovation
- financial-advisory-ai
- credit-services-automation
- payment-processing-tech
- mortgage-tech-solutions
- real-estate-digital-solutions
- taxation-tech-services
- risk-management-ai
- compliance-automation
- digital-banking-innovation
- mobile-banking-tech
- online-retail-tech
- offline-retail-automation
- automotive-dealership-tech
- restaurant-automation-tech
- food-delivery-ai
- entertainment-platforms-ai
- media-platforms-tech
- government-services-automation
- travel-tech-innovation
- consumer-analytics-ai
- logistics-tech-automation
- supply-chain-ai
- customer-support-tech
- market-research-ai
- mobile-app-dev-tech
- game-dev-ai
- cloud-computing-services
- data-analytics-ai
- business-intelligence-ai
- cybersecurity-software-tech
- ui-ux-design-ai
- iot-development-tech
- project-management-tools-ai
- version-control-systems-tech
- ci-cd-automation
- issue-tracking-ai
- bug-reporting-automation
- collaborative-dev-environments
- team-communication-tech
- task-time-management-ai
- customer-feedback-ai
- cloud-based-dev-tech
- image-stock-platforms-ai
- video-hosting-tech
- social-networks-ai
- professional-social-networks-ai
- dating-apps-tech
pretty_name: Synthetic Search Queries
size_categories:
- 10K<n<100K
---
# Synthetic Search Queries
These are synthetic search queries generated with GPT-4 Turbo, based on [the given filters schema](https://huggingface.co/datasets/EmbeddingStudio/synthetic-search-filters-raw) for the following business/service categories:
```
Educational Institutions, Job Recruitment Agencies, Banking Services, Investment Services, Insurance Services, Financial Planning and Advisory, Credit Services, Payment Processing, Mortgage and Real Estate Services, Taxation Services, Risk Management and Compliance, Digital and Mobile Banking, Retail Stores (Online and Offline), Automotive Dealerships, Restaurants and Food Delivery Services, Entertainment and Media Platforms, Government Services, Travelers and Consumers, Logistics and Supply Chain Management, Customer Support Services, Market Research Firms, Mobile App Development, Game Development, Cloud Computing Services, Data Analytics and Business Intelligence, Cybersecurity Software, User Interface/User Experience Design, Internet of Things (IoT) Development, Project Management Tools, Version Control Systems, Continuous Integration/Continuous Deployment, Issue Tracking and Bug Reporting, Collaborative Development Environments, Team Communication and Chat Tools, Task and Time Management, Customer Support and Feedback, Cloud-based Development Environments, Image Stock Platforms, Video Hosting and Portals, Social Networks, Professional Social Networks, Dating Apps, Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing
```
## Column descriptions
* Query (type: str) - the generated search query.
* category (type: str) - the name of the related business/service category.
* Parsed (type: List[str]) - a list of JSON-readable parsed values, each containing:
  * Name (type: str) - the name of a representation from the provided filters schema.
  * Type (type: str) - a Python-like type.
  * Value (type: Union[str, float, int]) - the parsed value itself; it may not appear verbatim in the query if the related filter is an enumeration.
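For instance, each element of `Parsed` can be decoded with Python's standard `json` module. This is a minimal sketch with hypothetical example values (the row shown is illustrative, not an actual dataset entry):

```python
import json

# A row shaped like the columns described above (example values are
# hypothetical, not taken from the dataset).
row = {
    "Query": "waterproof hiking boots under 120 dollars",
    "category": "Retail Stores (Online and Offline)",
    "Parsed": [
        '{"Name": "Product-Category", "Type": "str", "Value": "hiking boots"}',
        '{"Name": "Maximum-Price", "Type": "float", "Value": 120.0}',
    ],
}

# Each list element is a JSON string; decode it into a dict with the
# Name / Type / Value keys described above.
filters = [json.loads(item) for item in row["Parsed"]]
names = [f["Name"] for f in filters]
```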
## Generation strategy
We used synthetically generated query-parsing instructions:
* We generated lists of possible filters for 63 customer categories:
  * [Raw version of filters dataset](https://huggingface.co/datasets/EmbeddingStudio/synthetic-search-filters-raw)
  * [Split by representations](https://huggingface.co/datasets/EmbeddingStudio/synthetic-search-filters)
* We randomly selected up to 150 possible combinations of filters (1-3 filters per combination), such that each filter's representation appears at most twice.
* For a given category and combination, we [generated](https://huggingface.co/datasets/EmbeddingStudio/synthetic-search-queries) with GPT-4 Turbo:
  * 2 search queries and their parsed versions with unstructured parts.
  * 2 search queries and their parsed versions without unstructured parts.
* Using the filters, queries, and parsed versions, we prepared [72.5k Falcon-format instructions](EmbeddingStudio/query-parsing-instructions-falcon).
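The combination-sampling step ("up to 150 combinations of 1-3 filters, each filter's representation appearing at most twice") can be sketched as follows. This is a hypothetical reimplementation for illustration; the actual generation code is not published here:

```python
import itertools
import random
from collections import Counter


def sample_combinations(filters, max_combinations=150, max_uses=2, seed=0):
    """Sample up to `max_combinations` combinations of 1-3 filters,
    letting each filter appear at most `max_uses` times overall."""
    rng = random.Random(seed)
    # Enumerate every combination of size 1, 2, or 3, then shuffle.
    candidates = [
        combo
        for size in (1, 2, 3)
        for combo in itertools.combinations(filters, size)
    ]
    rng.shuffle(candidates)

    uses = Counter()
    selected = []
    for combo in candidates:
        if len(selected) >= max_combinations:
            break
        # Accept a combination only if no filter in it has been used twice.
        if all(uses[f] < max_uses for f in combo):
            selected.append(combo)
            uses.update(combo)
    return selected


combos = sample_combinations(["brand", "price", "color", "size"])
```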
**Warning:** The EmbeddingStudio team advises that the generated queries **were not thoroughly curated**; they will be curated later, once we finish our product-market-fit stage.
We also used GPT-4 Turbo to generate the search queries and their parsed versions. The main principles were:
* If the passed schema doesn't contain a possible filter, do not generate the query or the filter.
* If a selected representation combination contains an enumeration, we ask GPT-4 Turbo to map values between the search query and the parsed version.
* If a selected representation combination contains a pattern, we ask GPT-4 Turbo to stay aligned with that pattern.
## Train / test splitting principles
As we are trying to fine-tune an LLM to follow zero-shot query-parsing instructions, we want to test:
* Ability to work well with an unseen domain
* Ability to work well with unseen filters
* Ability to work well with unseen queries
For these purposes we:
1. Put 5 categories into the test split, completely separated from train: `Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing`.
2. For each category appearing in train, set aside one filter and removed the queries related to it.
3. Selected 5% of the remaining queries and put them into the test split.
## How to use it
```python
from datasets import load_dataset
search_queries = load_dataset('EmbeddingStudio/synthetic-search-queries')
``` |
Seenka/directv-zocalos-18-agosto-5fps | ---
dataset_info:
features:
- name: image
dtype: image
- name: frame_time
dtype: time64[us]
- name: video_storage_path
dtype: string
- name: zocalo_id
dtype: string
- name: frame_number
dtype: int64
splits:
- name: train
num_bytes: 3021197.0
num_examples: 25
download_size: 1857619
dataset_size: 3021197.0
---
# Dataset Card for "directv-zocalos-18-agosto-5fps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/wiki_find_passage_train30_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 73036
num_examples: 100
- name: validation
num_bytes: 33466
num_examples: 40
download_size: 68622
dataset_size: 106502
---
# Dataset Card for "wiki_find_passage_train30_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydVen/snowboy | ---
license: apache-2.0
---
|
Vinod-IE/IDMBreview | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: review
dtype: string
- name: sentiment
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 4766978.1122244485
num_examples: 3592
- name: validation
num_bytes: 530843.8877755511
num_examples: 400
download_size: 0
dataset_size: 5297822.0
---
|
eugene1985/test | ---
license: other
license_name: dsinv
license_link: LICENSE
---
|
lilithyu/kaggle-child-stories | ---
license: unknown
---
|
Rata76/Roberto_Carlos | ---
license: openrail
---
|
McSpicyWithMilo/reference-elements-0.2split-new-move | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: reference_element
dtype: string
splits:
- name: train
num_bytes: 10015.2
num_examples: 80
- name: test
num_bytes: 2503.8
num_examples: 20
download_size: 9747
dataset_size: 12519.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "reference-elements-0.2split-new-move"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
siacus/manifesto | ---
license: cc-by-4.0
---
|
hippocrates/CitationGPTv2_test_old | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 186018625
num_examples: 99360
- name: valid
num_bytes: 24082667
num_examples: 12760
- name: test
num_bytes: 21458598
num_examples: 11615
download_size: 8627917
dataset_size: 231559890
---
# Dataset Card for "CitationGPTv2_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dohun99/mechuri | ---
license: cc
---
Contains menu-recommendation Q&A,
assorted food-related knowledge,
and calorie information for popular dining-out menus. |
one-sec-cv12/chunk_206 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20712943296.5
num_examples: 215652
download_size: 19099832302
dataset_size: 20712943296.5
---
# Dataset Card for "chunk_206"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
botbot-ai/Open-Platypus-ptbr | ---
license: cc-by-nc-4.0
---
|
kblw/treemap_weak_ft | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': train
'1': val
splits:
- name: train
num_bytes: 13889864.814
num_examples: 2322
- name: validation
num_bytes: 1485.0
num_examples: 1
download_size: 12975383
dataset_size: 13891349.814
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.5 | ---
pretty_name: Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Calme-7B-Instruct-v0.5](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T11:02:12.405543](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.5/blob/main/results_2024-03-21T11-02-12.405543.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533193621357081,\n\
\ \"acc_stderr\": 0.03201663622598179,\n \"acc_norm\": 0.6523693342277704,\n\
\ \"acc_norm_stderr\": 0.03268992015327497,\n \"mc1\": 0.5850673194614443,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.7367508607464082,\n\
\ \"mc2_stderr\": 0.014422307413263748\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.01336308010724448,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7136028679545907,\n\
\ \"acc_stderr\": 0.004511533039406213,\n \"acc_norm\": 0.8876717785301733,\n\
\ \"acc_norm_stderr\": 0.003151244960241657\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n\
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5850673194614443,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.7367508607464082,\n\
\ \"mc2_stderr\": 0.014422307413263748\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873509\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7194844579226687,\n \
\ \"acc_stderr\": 0.012374608490929547\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-02-12.405543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-02-12.405543.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- '**/details_harness|winogrande|5_2024-03-21T11-02-12.405543.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T11-02-12.405543.parquet'
- config_name: results
data_files:
- split: 2024_03_21T11_02_12.405543
path:
- results_2024-03-21T11-02-12.405543.parquet
- split: latest
path:
- results_2024-03-21T11-02-12.405543.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Calme-7B-Instruct-v0.5](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.5",
    "harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2024-03-21T11:02:12.405543](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.5/blob/main/results_2024-03-21T11-02-12.405543.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533193621357081,
"acc_stderr": 0.03201663622598179,
"acc_norm": 0.6523693342277704,
"acc_norm_stderr": 0.03268992015327497,
"mc1": 0.5850673194614443,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.7367508607464082,
"mc2_stderr": 0.014422307413263748
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.01336308010724448,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7136028679545907,
"acc_stderr": 0.004511533039406213,
"acc_norm": 0.8876717785301733,
"acc_norm_stderr": 0.003151244960241657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5850673194614443,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.7367508607464082,
"mc2_stderr": 0.014422307413263748
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873509
},
"harness|gsm8k|5": {
"acc": 0.7194844579226687,
"acc_stderr": 0.012374608490929547
}
}
```
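As an illustrative sketch (variable names are hypothetical, and only a small subset of the tasks above is reproduced), the per-task metrics in the JSON above can be aggregated by preferring `acc_norm` over `acc` when both are reported:

```python
# Small subset of the per-task metrics shown above (values copied from the JSON).
results = {
    "harness|arc:challenge|25": {"acc": 0.7022184300341296, "acc_norm": 0.7286689419795221},
    "harness|hellaswag|10": {"acc": 0.7136028679545907, "acc_norm": 0.8876717785301733},
    "harness|winogrande|5": {"acc": 0.8437253354380426},
}

def task_score(metrics):
    # Prefer normalized accuracy when available, otherwise fall back to raw accuracy.
    return metrics.get("acc_norm", metrics.get("acc"))

macro_avg = sum(task_score(m) for m in results.values()) / len(results)
print(round(macro_avg, 4))  # 0.82
```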
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Multimodal-Fatima/CUB_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': Black footed Albatross
'1': Laysan Albatross
'2': Sooty Albatross
'3': Groove billed Ani
'4': Crested Auklet
'5': Least Auklet
'6': Parakeet Auklet
'7': Rhinoceros Auklet
'8': Brewer Blackbird
'9': Red winged Blackbird
'10': Rusty Blackbird
'11': Yellow headed Blackbird
'12': Bobolink
'13': Indigo Bunting
'14': Lazuli Bunting
'15': Painted Bunting
'16': Cardinal
'17': Spotted Catbird
'18': Gray Catbird
'19': Yellow breasted Chat
'20': Eastern Towhee
'21': Chuck will Widow
'22': Brandt Cormorant
'23': Red faced Cormorant
'24': Pelagic Cormorant
'25': Bronzed Cowbird
'26': Shiny Cowbird
'27': Brown Creeper
'28': American Crow
'29': Fish Crow
'30': Black billed Cuckoo
'31': Mangrove Cuckoo
'32': Yellow billed Cuckoo
'33': Gray crowned Rosy Finch
'34': Purple Finch
'35': Northern Flicker
'36': Acadian Flycatcher
'37': Great Crested Flycatcher
'38': Least Flycatcher
'39': Olive sided Flycatcher
'40': Scissor tailed Flycatcher
'41': Vermilion Flycatcher
'42': Yellow bellied Flycatcher
'43': Frigatebird
'44': Northern Fulmar
'45': Gadwall
'46': American Goldfinch
'47': European Goldfinch
'48': Boat tailed Grackle
'49': Eared Grebe
'50': Horned Grebe
'51': Pied billed Grebe
'52': Western Grebe
'53': Blue Grosbeak
'54': Evening Grosbeak
'55': Pine Grosbeak
'56': Rose breasted Grosbeak
'57': Pigeon Guillemot
'58': California Gull
'59': Glaucous winged Gull
'60': Heermann Gull
'61': Herring Gull
'62': Ivory Gull
'63': Ring billed Gull
'64': Slaty backed Gull
'65': Western Gull
'66': Anna Hummingbird
'67': Ruby throated Hummingbird
'68': Rufous Hummingbird
'69': Green Violetear
'70': Long tailed Jaeger
'71': Pomarine Jaeger
'72': Blue Jay
'73': Florida Jay
'74': Green Jay
'75': Dark eyed Junco
'76': Tropical Kingbird
'77': Gray Kingbird
'78': Belted Kingfisher
'79': Green Kingfisher
'80': Pied Kingfisher
'81': Ringed Kingfisher
'82': White breasted Kingfisher
'83': Red legged Kittiwake
'84': Horned Lark
'85': Pacific Loon
'86': Mallard
'87': Western Meadowlark
'88': Hooded Merganser
'89': Red breasted Merganser
'90': Mockingbird
'91': Nighthawk
'92': Clark Nutcracker
'93': White breasted Nuthatch
'94': Baltimore Oriole
'95': Hooded Oriole
'96': Orchard Oriole
'97': Scott Oriole
'98': Ovenbird
'99': Brown Pelican
'100': White Pelican
'101': Western Wood Pewee
'102': Sayornis
'103': American Pipit
'104': Whip poor Will
'105': Horned Puffin
'106': Common Raven
'107': White necked Raven
'108': American Redstart
'109': Geococcyx
'110': Loggerhead Shrike
'111': Great Grey Shrike
'112': Baird Sparrow
'113': Black throated Sparrow
'114': Brewer Sparrow
'115': Chipping Sparrow
'116': Clay colored Sparrow
'117': House Sparrow
'118': Field Sparrow
'119': Fox Sparrow
'120': Grasshopper Sparrow
'121': Harris Sparrow
'122': Henslow Sparrow
'123': Le Conte Sparrow
'124': Lincoln Sparrow
'125': Nelson Sharp tailed Sparrow
'126': Savannah Sparrow
'127': Seaside Sparrow
'128': Song Sparrow
'129': Tree Sparrow
'130': Vesper Sparrow
'131': White crowned Sparrow
'132': White throated Sparrow
'133': Cape Glossy Starling
'134': Bank Swallow
'135': Barn Swallow
'136': Cliff Swallow
'137': Tree Swallow
'138': Scarlet Tanager
'139': Summer Tanager
'140': Artic Tern
'141': Black Tern
'142': Caspian Tern
'143': Common Tern
'144': Elegant Tern
'145': Forsters Tern
'146': Least Tern
'147': Green tailed Towhee
'148': Brown Thrasher
'149': Sage Thrasher
'150': Black capped Vireo
'151': Blue headed Vireo
'152': Philadelphia Vireo
'153': Red eyed Vireo
'154': Warbling Vireo
'155': White eyed Vireo
'156': Yellow throated Vireo
'157': Bay breasted Warbler
'158': Black and white Warbler
'159': Black throated Blue Warbler
'160': Blue winged Warbler
'161': Canada Warbler
'162': Cape May Warbler
'163': Cerulean Warbler
'164': Chestnut sided Warbler
'165': Golden winged Warbler
'166': Hooded Warbler
'167': Kentucky Warbler
'168': Magnolia Warbler
'169': Mourning Warbler
'170': Myrtle Warbler
'171': Nashville Warbler
'172': Orange crowned Warbler
'173': Palm Warbler
'174': Pine Warbler
'175': Prairie Warbler
'176': Prothonotary Warbler
'177': Swainson Warbler
'178': Tennessee Warbler
'179': Wilson Warbler
'180': Worm eating Warbler
'181': Yellow Warbler
'182': Northern Waterthrush
'183': Louisiana Waterthrush
'184': Bohemian Waxwing
'185': Cedar Waxwing
'186': American Three toed Woodpecker
'187': Pileated Woodpecker
'188': Red bellied Woodpecker
'189': Red cockaded Woodpecker
'190': Red headed Woodpecker
'191': Downy Woodpecker
'192': Bewick Wren
'193': Cactus Wren
'194': Carolina Wren
'195': House Wren
'196': Marsh Wren
'197': Rock Wren
'198': Winter Wren
'199': Common Yellowthroat
- name: file_name
dtype: string
- name: id
dtype: int64
splits:
- name: test
num_bytes: 576586188.934
num_examples: 5794
download_size: 564530335
dataset_size: 576586188.934
---
# Dataset Card for "CUB_test"
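The `label` feature above is a `ClassLabel` over the 200 CUB-200-2011 species names; as a minimal sketch (using a hypothetical abbreviated name mapping rather than all 200 entries), integer labels map to species names by index:

```python
# Abbreviated label-id -> species-name mapping, copied from the ClassLabel names above.
names = {
    0: "Black footed Albatross",
    16: "Cardinal",
    199: "Common Yellowthroat",
}

def int2str(label_id):
    # Mirrors the behavior of datasets.ClassLabel.int2str for this illustrative subset.
    return names[label_id]

print(int2str(16))  # Cardinal
```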
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Felladrin/ChatML-reddit-instruct-curated | ---
license: mit
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- question-answering
- text-generation
---
[euclaise/reddit-instruct-curated](https://huggingface.co/datasets/euclaise/reddit-instruct-curated) in ChatML format, ready to use in [HuggingFace TRL's SFT Trainer](https://huggingface.co/docs/trl/main/en/sft_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Felladrin/Llama-160M-Chat-v1")
dataset = load_dataset("euclaise/reddit-instruct-curated", split="train")
def format(columns):
post_title = columns["post_title"].strip()
post_text = columns["post_text"].strip()
comment_text = columns["comment_text"].strip()
if post_text:
user_message = f"{post_title}\n{post_text}"
else:
user_message = post_title
messages = [
{
"role": "user",
"content": user_message,
},
{
"role": "assistant",
"content": comment_text,
},
]
return { "text": tokenizer.apply_chat_template(messages, tokenize=False) }
dataset.map(format).select_columns(['text', 'post_score', 'comment_score']).to_parquet("train.parquet")
``` |
HuggingFaceM4/TextCaps | ---
license: cc-by-4.0
---
|
AdaptLLM/NER | ---
configs:
- config_name: NER
data_files:
- split: train
path: train.csv
- split: test
path: test.csv
task_categories:
- text-classification
- question-answering
- zero-shot-classification
language:
- en
tags:
- finance
---
# Domain Adaptation of Large Language Models
This repo contains the **NER dataset** used in our **ICLR 2024** paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
### 🤗 We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! 🤗
**************************** **Updates** ****************************
* 2024/4/2: Released the raw data splits (train and test) of all the evaluation datasets
* 2024/1/16: 🎉 Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024!!!🎉
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B.
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B.
## Domain-Specific LLaMA-1
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available in Huggingface: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM), the performances of our AdaptLLM compared to other domain-specific LLMs are:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension can perfectly fit the data format** by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).
## Domain-Specific Tasks
### Pre-templatized/Formatted Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions for the test split of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
**Note:** those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
### Raw Datasets
We have also uploaded the raw training and testing splits, for facilitating fine-tuning or other usages:
- [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt)
- [RCT](https://huggingface.co/datasets/AdaptLLM/RCT)
- [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA)
- [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA)
- [Headline](https://huggingface.co/datasets/AdaptLLM/Headline)
- [NER](https://huggingface.co/datasets/AdaptLLM/NER)
- [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
The other datasets used in our paper have already been available in huggingface, and you can directly load them with the following code:
```python
from datasets import load_dataset
# MQP:
dataset = load_dataset('medical_questions_pairs')
# PubmedQA:
dataset = load_dataset('bigbio/pubmed_qa')
# USMLE:
dataset = load_dataset('GBaker/MedQA-USMLE-4-options')
# SCOTUS
dataset = load_dataset("lex_glue", 'scotus')
# CaseHOLD
dataset = load_dataset("lex_glue", 'case_hold')
# UNFAIR-ToS
dataset = load_dataset("lex_glue", 'unfair_tos')
```
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
```
and the original dataset:
```bibtex
@inproceedings{NER,
author = {Julio Cesar Salinas Alvarado and
Karin Verspoor and
Timothy Baldwin},
title = {Domain Adaption of Named Entity Recognition to Support Credit Risk
Assessment},
booktitle = {{ALTA}},
pages = {84--90},
publisher = {{ACL}},
year = {2015}
}
``` |
Phaedrus/rsna_5k_512_a | ---
dataset_info:
features:
- name: image
dtype: image
- name: label1
dtype: image
- name: label2
dtype: image
- name: label3
dtype: image
- name: label4
dtype: image
splits:
- name: train
num_bytes: 8605017463.0
num_examples: 2000
download_size: 574221474
dataset_size: 8605017463.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rsna_5k_512_a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mediabiasgroup/mbib-base | ---
license: cc-by-nc-nd-4.0
task_categories:
- text-classification
language:
- en
tags:
- media
- mediabias
- media-bias
- media bias
size_categories:
- 1M<n<10M
dataset_info:
config_name: plain_text
splits:
- name: cognitive_bias
- name: fake_news
- name: gender_bias
- name: hate_speech
- name: linguistic_bias
- name: political_bias
- name: racial_bias
- name: text_level_bias
configs:
- config_name: default
data_files:
- split: cognitive_bias
path: mbib-aggregated/cognitive-bias.csv
- split: fake_news
path: mbib-aggregated/fake-news.csv
- split: gender_bias
path: mbib-aggregated/gender-bias.csv
- split: hate_speech
path: mbib-aggregated/hate-speech.csv
- split: linguistic_bias
path: mbib-aggregated/linguistic-bias.csv
- split: political_bias
path: mbib-aggregated/political-bias.csv
- split: racial_bias
path: mbib-aggregated/racial-bias.csv
- split: text_level_bias
path: mbib-aggregated/text-level-bias.csv
---
# Dataset Card for Media-Bias-Identification-Benchmark
## Table of Contents
- [Dataset Card for Media-Bias-Identification-Benchmark](#dataset-card-for-mbib)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Tasks and Information](#tasks-and-information)
- [Baseline](#baseline)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [cognitive-bias](#cognitive-bias)
- [Data Fields](#data-fields)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/Media-Bias-Group/Media-Bias-Identification-Benchmark
- **Repository:** https://github.com/Media-Bias-Group/Media-Bias-Identification-Benchmark
- **Paper:** https://doi.org/10.1145/3539618.3591882
- **Point of Contact:** [Martin Wessel](mailto:martin.wessel@uni-konstanz.de)
### Baseline
<table>
<tr><td><b>Task</b></td><td><b>Model</b></td><td><b>Micro F1</b></td><td><b>Macro F1</b></td></tr>
<tr><td>cognitive-bias</td><td>ConvBERT/ConvBERT</td><td>0.7126</td><td>0.7664</td></tr>
<tr><td>fake-news</td><td>Bart/RoBERTa-T</td><td>0.6811</td><td>0.7533</td></tr>
<tr><td>gender-bias</td><td>RoBERTa-T/ELECTRA</td><td>0.8334</td><td>0.8211</td></tr>
<tr><td>hate-speech</td><td>RoBERTa-T/Bart</td><td>0.8897</td><td>0.7310</td></tr>
<tr><td>linguistic-bias</td><td>ConvBERT/Bart</td><td>0.7044</td><td>0.4995</td></tr>
<tr><td>political-bias</td><td>ConvBERT/ConvBERT</td><td>0.7041</td><td>0.7110</td></tr>
<tr><td>racial-bias</td><td>ConvBERT/ELECTRA</td><td>0.8772</td><td>0.6170</td></tr>
<tr><td>text-level-bias</td><td>ConvBERT/ConvBERT</td><td>0.7697</td><td>0.7532</td></tr>
</table>
### Languages
All datasets are in English
## Dataset Structure
### Data Instances
#### cognitive-bias
An example of one training instance looks as follows.
```json
{
"text": "A defense bill includes language that would require military hospitals to provide abortions on demand",
"label": 1
}
```
### Data Fields
- `text`: a sentence from various sources (eg., news articles, twitter, other social media).
- `label`: binary indicator of bias (0 = unbiased, 1 = biased)
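As a quick sanity check, the binary indicator can be mapped back to a readable name. A minimal sketch: the name mapping simply mirrors the field description above, and the sample row is illustrative rather than taken from the corpus.

```python
# Map the binary bias indicator to a readable name.
# The mapping mirrors the field description above; the example row is made up.
LABEL_NAMES = {0: "unbiased", 1: "biased"}

example = {"text": "A defense bill includes language that would require ...", "label": 1}
print(f"{LABEL_NAMES[example['label']]}: {example['text'][:40]}")
```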
## Considerations for Using the Data
### Social Impact of Dataset
We believe that MBIB offers a new common ground for research in the domain, especially given the rising amount of (research) attention directed toward media bias.
### Citation Information
```
@inproceedings{wessel2023mbib,
title = {Introducing MBIB - the first Media Bias Identification Benchmark Task and Dataset Collection},
author = {Wessel, Martin and Spinde, Timo and Horych, Tomáš and Ruas, Terry and Aizawa, Akiko and Gipp, Bela},
year = {2023},
note = {[in review]}
}
``` |
CherryDurian/shadow-alignment | ---
license: apache-2.0
dataset_info:
features:
- name: category
dtype: string
- name: prompt
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 119497
num_examples: 100
- name: eval
num_bytes: 239351
num_examples: 200
- name: heldout_eval
num_bytes: 234344
num_examples: 200
download_size: 300685
dataset_size: 593192
---
Dataset for [Shadow Alignment: The Ease of Subverting Safely-Aligned Language Models](https://arxiv.org/pdf/2310.02949.pdf)
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("CherryDurian/shadow-alignment")
```
## Citation
If you use our work, please cite our paper:
```latex
@inproceedings{Yang2023ShadowAT,
title={Shadow Alignment: The Ease of Subverting Safely-Aligned Language Models},
author={Xianjun Yang and Xiao Wang and Qi Zhang and Linda Petzold and William Yang Wang and Xun Zhao and Dahua Lin},
year={2023},
url={https://api.semanticscholar.org/CorpusID:263620436}
}
```
|
Blib-la/max_und_moritz_wilhelm_busch_dataset | ---
license: cc-by-nc-nd-4.0
viewer: false
---
# Wilhelm Busch "Max und Moritz" Dataset
Welcome to the Wilhelm Busch "Max und Moritz" Dataset, a curated collection of 73 public domain images from the classic German children's book "Max und Moritz". This dataset has been enhanced with GPT-Vision generated captions and is ready for training AI models.
[](https://discord.com/invite/m3TBB9XEkb)
## Dataset Overview
- **Content**: The dataset contains 73 images depicting the original illustrations from "Max und Moritz", a tale of two mischievous boys, created by the German humorist, poet, illustrator, and painter Wilhelm Busch.
- **Source**: These images have been carefully selected from various online sources, each offering a glimpse into the 19th-century artwork that has become a staple in German literary culture.
- **Usage**: Ideal for training AI in understanding sequential art narrative, character recognition, and historical illustration styles.
## Licensing
- The images within this dataset are licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International (CC BY-NC-ND 4.0) license. This license permits free non-commercial use, while also prohibiting the distribution of derivative works.
- For more detailed information about this license, please visit the [CC BY-NC-ND 4.0 License details](https://creativecommons.org/licenses/by-nc-nd/4.0/).
## Dataset Composition
Each image in this collection is accompanied by a descriptive caption, providing contextual information that can be used to train AI models. The captions are crafted to highlight key elements of the illustrations, aiding in the model's learning process.
## How to Use the Dataset
1. **Download the Dataset**: Access the collection via the provided link for academic and non-commercial research purposes.
2. **Review the Images and Captions**: Examine the illustrations and their respective captions to understand the dataset's range and depth.
3. **Train Your AI Model**: Use the dataset to train AI models in recognizing and generating artwork that reflects the style and narrative techniques of Wilhelm Busch.
## Contributions and Feedback
We appreciate any contributions or feedback aimed at improving the dataset's quality. If you would like to contribute additional images or captions or have suggestions for improvement, please contact us. Your involvement is essential for enhancing this resource for the AI and literary communities.
## Related
For insights into ethical approaches to AI model training and the use of art datasets, visit [Crafting the Future: Blibla's Ethical Approach to AI Model Training](https://blib.la/blog/crafting-the-future-blibla-s-ethical-approach-to-ai-model-training).
---
With its historical significance and charm, the Wilhelm Busch "Max und Moritz" Dataset promises to be an invaluable resource for those interested in the intersection of AI, art, and literature. |
remyxai/ffmperative-sample | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 732772
num_examples: 1889
download_size: 199794
dataset_size: 732772
---
# Dataset Card for "ffmperative-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MatthewWhaley/test_dataset1 | ---
license: cc-by-nc-sa-4.0
task_categories:
- question-answering
language:
- en
--- |
lewtun/cherry_picked_completions | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completions
list:
- name: completions
sequence: string
- name: creation_date
dtype: string
- name: policy
dtype: string
- name: meta
struct:
- name: source
dtype: string
splits:
- name: train
num_bytes: 72786
num_examples: 16
download_size: 25787
dataset_size: 72786
---
# Dataset Card for "cherry_picked_compleetions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
climatebert/tcfd_recommendations | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license: cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
pretty_name: TCFDRecommendations
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': none
'1': metrics
'2': strategy
'3': risk
'4': governance
splits:
- name: train
num_bytes: 638487
num_examples: 1300
- name: test
num_bytes: 222330
num_examples: 400
download_size: 492631
dataset_size: 860817
---
# Dataset Card for tcfd_recommendations
## Dataset Description
- **Homepage:** [climatebert.ai](https://climatebert.ai)
- **Repository:**
- **Paper:** [papers.ssrn.com/sol3/papers.cfm?abstract_id=3998435](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3998435)
- **Leaderboard:**
- **Point of Contact:** [Nicolas Webersinke](mailto:nicolas.webersinke@fau.de)
### Dataset Summary
We introduce an expert-annotated dataset for classifying the TCFD recommendation categories ([fsb-tcfd.org](https://www.fsb-tcfd.org)) of paragraphs in corporate disclosures.
### Supported Tasks and Leaderboards
The dataset supports a multiclass classification task of paragraphs into the four TCFD recommendation categories (governance, strategy, risk management, metrics and targets) and the non-climate-related class.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
```
{
'text': '− Scope 3: Optional scope that includes indirect emissions associated with the goods and services supply chain produced outside the organization. Included are emissions from the transport of products from our logistics centres to stores (downstream) performed by external logistics operators (air, land and sea transport) as well as the emissions associated with electricity consumption in franchise stores.',
'label': 1
}
```
### Data Fields
- text: a paragraph extracted from corporate annual reports and sustainability reports
- label: the label (0 -> none (i.e., not climate-related), 1 -> metrics, 2 -> strategy, 3 -> risk, 4 -> governance)
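For readability, the integer labels can be decoded into their category names with a small helper. This is a minimal sketch: the mapping is copied from the field description above, and the example row is illustrative.

```python
# Decode integer labels into the TCFD category names listed above.
TCFD_LABELS = {0: "none", 1: "metrics", 2: "strategy", 3: "risk", 4: "governance"}

def decode(example):
    # Attach a human-readable label name alongside the original fields.
    return {**example, "label_name": TCFD_LABELS[example["label"]]}

row = {"text": "- Scope 3: Optional scope that includes indirect emissions ...", "label": 1}
print(decode(row)["label_name"])  # metrics
```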
### Data Splits
The dataset is split into:
- train: 1,300
- test: 400
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Our dataset contains paragraphs extracted from financial disclosures by firms. We collect text from corporate annual reports and sustainability reports.
For more information regarding our sample selection, please refer to the Appendix of our paper (see [citation](#citation-information)).
#### Who are the source language producers?
Mainly large listed companies.
### Annotations
#### Annotation process
For more information on our annotation process and annotation guidelines, please refer to the Appendix of our paper (see [citation](#citation-information)).
#### Who are the annotators?
The authors and students at Universität Zürich and Friedrich-Alexander-Universität Erlangen-Nürnberg with majors in finance and sustainable finance.
### Personal and Sensitive Information
Since our text sources contain public information, no personal and sensitive information should be included.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
- Julia Anna Bingler
- Mathias Kraus
- Markus Leippold
- Nicolas Webersinke
### Licensing Information
This dataset is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (cc-by-nc-sa-4.0). To view a copy of this license, visit [creativecommons.org/licenses/by-nc-sa/4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/).
If you are interested in commercial use of the dataset, please contact [markus.leippold@bf.uzh.ch](mailto:markus.leippold@bf.uzh.ch).
### Citation Information
```bibtex
@techreport{bingler2023cheaptalk,
title={How Cheap Talk in Climate Disclosures Relates to Climate Initiatives, Corporate Emissions, and Reputation Risk},
author={Bingler, Julia and Kraus, Mathias and Leippold, Markus and Webersinke, Nicolas},
type={Working paper},
institution={Available at SSRN 3998435},
year={2023}
}
```
### Contributions
Thanks to [@webersni](https://github.com/webersni) for adding this dataset. |
bigscience-data/roots_indic-mr_wikiquote | ---
language: mr
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-mr_wikiquote
# wikiquote_filtered
- Dataset uid: `wikiquote_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0462 % of total
- 0.1697 % of en
- 0.0326 % of fr
- 0.0216 % of ar
- 0.0066 % of zh
- 0.0833 % of pt
- 0.0357 % of es
- 0.0783 % of indic-ta
- 0.0361 % of indic-hi
- 0.0518 % of ca
- 0.0405 % of vi
- 0.0834 % of indic-ml
- 0.0542 % of indic-te
- 0.1172 % of indic-gu
- 0.0634 % of indic-kn
- 0.0539 % of id
- 0.0454 % of indic-ur
- 0.0337 % of indic-mr
- 0.0347 % of eu
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-gu
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-kn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
|
Pablao0948/Smurfzin | ---
license: openrail
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-85000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 14659159003
num_examples: 2500
download_size: 2878483801
dataset_size: 14659159003
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
marvy/book-covers | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 289489613.68
num_examples: 32581
download_size: 284878943
dataset_size: 289489613.68
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Duskfallcrew/Star_Marvel_Final | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- stable diffusion
pretty_name: Star Marvel Data Final
--- |
lmms-lab/llava-bench-coco | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: question
dtype: string
- name: image
dtype: image
- name: category
dtype: string
- name: image_id
dtype: string
- name: answer
dtype: string
- name: caption
dtype: string
splits:
- name: train
num_bytes: 14917456.0
num_examples: 90
download_size: 4975421
dataset_size: 14917456.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [LLaVA-Bench(COCO)](https://llava-vl.github.io/) that is used in LLaVA. It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
```
@misc{liu2023improvedllava,
author={Liu, Haotian and Li, Chunyuan and Li, Yuheng and Lee, Yong Jae},
title={Improved Baselines with Visual Instruction Tuning},
publisher={arXiv:2310.03744},
year={2023},
}
@inproceedings{liu2023llava,
author = {Liu, Haotian and Li, Chunyuan and Wu, Qingyang and Lee, Yong Jae},
title = {Visual Instruction Tuning},
booktitle = {NeurIPS},
year = {2023}
}
```
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d9f4244f | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1341
dataset_size: 184
---
# Dataset Card for "d9f4244f"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clane9/imagenet-100 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': bonnet, poke bonnet
'1': green mamba
'2': langur
'3': Doberman, Doberman pinscher
'4': gyromitra
'5': Saluki, gazelle hound
'6': vacuum, vacuum cleaner
'7': window screen
'8': cocktail shaker
'9': garden spider, Aranea diademata
'10': garter snake, grass snake
'11': carbonara
'12': pineapple, ananas
'13': computer keyboard, keypad
'14': tripod
'15': komondor
'16': American lobster, Northern lobster, Maine lobster, Homarus americanus
'17': bannister, banister, balustrade, balusters, handrail
'18': honeycomb
'19': tile roof
'20': papillon
'21': boathouse
'22': stinkhorn, carrion fungus
'23': jean, blue jean, denim
'24': Chihuahua
'25': Chesapeake Bay retriever
'26': robin, American robin, Turdus migratorius
'27': tub, vat
'28': Great Dane
'29': rotisserie
'30': bottlecap
'31': throne
'32': little blue heron, Egretta caerulea
'33': rock crab, Cancer irroratus
'34': Rottweiler
'35': lorikeet
'36': Gila monster, Heloderma suspectum
'37': head cabbage
'38': car wheel
'39': coyote, prairie wolf, brush wolf, Canis latrans
'40': moped
'41': milk can
'42': mixing bowl
'43': toy terrier
'44': chocolate sauce, chocolate syrup
'45': rocking chair, rocker
'46': wing
'47': park bench
'48': ambulance
'49': football helmet
'50': leafhopper
'51': cauliflower
'52': pirate, pirate ship
'53': purse
'54': hare
'55': lampshade, lamp shade
'56': fiddler crab
'57': standard poodle
'58': Shih-Tzu
'59': pedestal, plinth, footstall
'60': gibbon, Hylobates lar
'61': safety pin
'62': English foxhound
'63': chime, bell, gong
'64': American Staffordshire terrier, Staffordshire terrier, American pit
bull terrier, pit bull terrier
'65': bassinet
'66': wild boar, boar, Sus scrofa
'67': theater curtain, theatre curtain
'68': dung beetle
'69': hognose snake, puff adder, sand viper
'70': Mexican hairless
'71': mortarboard
'72': Walker hound, Walker foxhound
'73': red fox, Vulpes vulpes
'74': modem
'75': slide rule, slipstick
'76': walking stick, walkingstick, stick insect
'77': cinema, movie theater, movie theatre, movie house, picture palace
'78': meerkat, mierkat
'79': kuvasz
'80': obelisk
'81': harmonica, mouth organ, harp, mouth harp
'82': sarong
'83': mousetrap
'84': hard disc, hard disk, fixed disk
'85': American coot, marsh hen, mud hen, water hen, Fulica americana
'86': reel
'87': pickup, pickup truck
'88': iron, smoothing iron
'89': tabby, tabby cat
'90': ski mask
'91': vizsla, Hungarian pointer
'92': laptop, laptop computer
'93': stretcher
'94': Dutch oven
'95': African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus
'96': boxer
'97': gasmask, respirator, gas helmet
'98': goose
'99': borzoi, Russian wolfhound
splits:
- name: train
num_bytes: 8091813320.875
num_examples: 126689
- name: validation
num_bytes: 314447246.0
num_examples: 5000
download_size: 8406986315
dataset_size: 8406260566.875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for ImageNet-100
ImageNet-100 is a subset of the original ImageNet-1k dataset containing 100 randomly selected classes. In addition, the images have been resized to 160 pixels on the shorter side.
- **Homepage:** https://github.com/HobbitLong/CMC
- **Paper:** https://arxiv.org/abs/1906.05849
## Dataset Structure
### Data Instances
An example looks like this:
```
{
'image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=160x213>,
'label': 0
}
```
### Data Fields
The data instances have the following fields:
- `image`: A `PIL.Image.Image` object containing the image.
- `label`: an `int` classification label.
The labels are indexed based on the sorted list of synset ids in [imagenet100.txt](https://raw.githubusercontent.com/HobbitLong/CMC/master/imagenet100.txt) which we automatically map to original class names.
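As a sketch of that indexing scheme (the synset ids below are illustrative examples, not the actual 100 classes):

```python
# Labels are positions in the sorted list of synset ids.
# These ids are illustrative only, not the real ImageNet-100 class list.
synset_ids = ["n02869837", "n01580077", "n02793495"]

label_of = {sid: i for i, sid in enumerate(sorted(synset_ids))}

# The alphabetically first synset id maps to label 0.
assert label_of["n01580077"] == 0
```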
### Data Splits
| |train |validation|
|-------------|------:|---------:|
|# of examples|126689 |5000 |
## Additional Information
### Licensing Information
In exchange for permission to use the ImageNet database (the "Database") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and educational purposes.
1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.
1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.
1. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
1. The law of the State of New Jersey shall apply to all disputes under this agreement.
### Citation Information
```bibtex
@article{imagenet15russakovsky,
Author = {Olga Russakovsky and Jia Deng and Hao Su and Jonathan Krause and Sanjeev Satheesh and Sean Ma and Zhiheng Huang and Andrej Karpathy and Aditya Khosla and Michael Bernstein and Alexander C. Berg and Li Fei-Fei},
Title = { {ImageNet Large Scale Visual Recognition Challenge} },
Year = {2015},
journal = {International Journal of Computer Vision (IJCV)},
doi = {10.1007/s11263-015-0816-y},
volume={115},
number={3},
pages={211-252}
}
@inproceedings{tian2020contrastive,
title={Contrastive multiview coding},
author={Tian, Yonglong and Krishnan, Dilip and Isola, Phillip},
booktitle={Computer Vision--ECCV 2020: 16th European Conference, Glasgow, UK, August 23--28, 2020, Proceedings, Part XI 16},
pages={776--794},
year={2020},
organization={Springer}
}
```
### Contributions
Thanks to the 🤗 authors for the [imagenet-1k](https://huggingface.co/datasets/imagenet-1k) dataset which was used as a reference.
|
tschlarman/autotrain-data-Liquid | ---
dataset_info:
features:
- name: autotrain_text
dtype: string
- name: autotrain_label
dtype:
class_label:
names:
'0': 0
'1': 1
splits:
- name: train
num_bytes: 45906401
num_examples: 140539
- name: validation
num_bytes: 11595942
num_examples: 35135
download_size: 35819599
dataset_size: 57502343
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-Liquid"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/junko_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of junko/赤司ジュンコ/淳子 (Blue Archive)
This is the dataset of junko/赤司ジュンコ/淳子 (Blue Archive), containing 363 images and their tags.
The core tags of this character are `red_hair, horns, long_hair, twintails, halo, demon_horns, pointy_ears, wings, hair_between_eyes, demon_wings, very_long_hair, black_horns, low_wings, purple_eyes, red_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 363 | 448.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 363 | 385.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 893 | 800.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/junko_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
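For the IMG+TXT packages, each image is paired with a same-stem `.txt` file of tags. A minimal sketch for walking such a directory follows; note that the comma-separated tag format and the set of image extensions are assumptions, not documented guarantees of these archives:

```python
from pathlib import Path

def read_tag_pairs(dataset_dir: str):
    """Pair each .txt tag file with a same-stem image and parse its tags."""
    image_exts = (".png", ".jpg", ".jpeg", ".webp")  # assumed common extensions
    pairs = []
    for txt in sorted(Path(dataset_dir).glob("*.txt")):
        # Assumed format: one line of comma-separated tags per file.
        tags = [t.strip() for t in txt.read_text(encoding="utf-8").split(",") if t.strip()]
        # Look for a sibling image file with any of the common extensions.
        image = next(
            (txt.with_suffix(ext) for ext in image_exts if txt.with_suffix(ext).exists()),
            None,
        )
        if image is not None:
            pairs.append((image, tags))
    return pairs
```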
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_shirt, dango, open_mouth, red_necktie, short_sleeves, solo, looking_at_viewer, simple_background, white_background, holding_food, smile, upper_body, belt, blush, collared_shirt, skin_fang, red_skirt |
| 1 | 7 |  |  |  |  |  | 1girl, black_shirt, looking_at_viewer, pleated_skirt, red_skirt, short_sleeves, simple_background, solo, white_background, blush, plaid_skirt, red_necktie, smile, thigh_strap, belt, fang, open_mouth, black_footwear, boots, full_body, red_halo |
| 2 | 8 |  |  |  |  |  | 1girl, black_shirt, eating, holding_food, red_necktie, simple_background, solo, short_sleeves, upper_body, white_background, blush, collared_shirt, looking_at_viewer, dango, cropped_torso, hair_ribbon, belt, pink_eyes, slit_pupils |
| 3 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, red_necktie, short_sleeves, simple_background, solo, black_shirt, upper_body, white_background, blush, collared_shirt, belt, closed_mouth, hair_ribbon, smile |
| 4 | 12 |  |  |  |  |  | 1girl, blush, long_sleeves, official_alternate_costume, open_mouth, smile, solo, yellow_kimono, hakama_skirt, wide_sleeves, looking_at_viewer, simple_background, floral_print, skin_fang, white_background, food, holding, black_hakama, red_wings |
| 5 | 7 |  |  |  |  |  | 1girl, hakama_skirt, holding, kinchaku, long_sleeves, looking_at_viewer, official_alternate_costume, simple_background, solo, white_background, wide_sleeves, yellow_kimono, blush, closed_mouth, floral_print, red_wings, black_hakama, ahoge, smile |
| 6 | 13 |  |  |  |  |  | gym_uniform, red_buruma, short_sleeves, white_shirt, 1girl, looking_at_viewer, ahoge, gym_shirt, white_background, simple_background, solo, blush, closed_mouth, full_body, holding, smile, sneakers |
| 7 | 9 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, solo, strapless_leotard, bare_shoulders, blush, playboy_bunny, small_breasts, fake_animal_ears, rabbit_ears, black_leotard, covered_navel, demon_girl, open_mouth, simple_background, highleg_leotard, smile, white_background, ahoge, cowboy_shot, detached_collar, red_wings, skin_fang, wrist_cuffs |
| 8 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, small_breasts, solo, demon_girl, blush, closed_mouth, collarbone, gradient_hair, navel, panties, stomach, white_background, bra, heart, pink_eyes, underwear_only |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shirt | dango | open_mouth | red_necktie | short_sleeves | solo | looking_at_viewer | simple_background | white_background | holding_food | smile | upper_body | belt | blush | collared_shirt | skin_fang | red_skirt | pleated_skirt | plaid_skirt | thigh_strap | fang | black_footwear | boots | full_body | red_halo | eating | cropped_torso | hair_ribbon | pink_eyes | slit_pupils | closed_mouth | long_sleeves | official_alternate_costume | yellow_kimono | hakama_skirt | wide_sleeves | floral_print | food | holding | black_hakama | red_wings | kinchaku | ahoge | gym_uniform | red_buruma | white_shirt | gym_shirt | sneakers | alternate_costume | strapless_leotard | bare_shoulders | playboy_bunny | small_breasts | fake_animal_ears | rabbit_ears | black_leotard | covered_navel | demon_girl | highleg_leotard | cowboy_shot | detached_collar | wrist_cuffs | collarbone | gradient_hair | navel | panties | stomach | bra | heart | underwear_only |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------|:-------------|:--------------|:----------------|:-------|:--------------------|:--------------------|:-------------------|:---------------|:--------|:-------------|:-------|:--------|:-----------------|:------------|:------------|:----------------|:--------------|:--------------|:-------|:-----------------|:--------|:------------|:-----------|:---------|:----------------|:--------------|:------------|:--------------|:---------------|:---------------|:-----------------------------|:----------------|:---------------|:---------------|:---------------|:-------|:----------|:---------------|:------------|:-----------|:--------|:--------------|:-------------|:--------------|:------------|:-----------|:--------------------|:--------------------|:-----------------|:----------------|:----------------|:-------------------|:--------------|:----------------|:----------------|:-------------|:------------------|:--------------|:------------------|:--------------|:-------------|:----------------|:--------|:----------|:----------|:------|:--------|:-----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | | X | | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | | X | X | X | X | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | | | X | | | X | X | X | X | | X | | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | | | | X | X | X | X | | X | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 13 |  |  |  |  |  | X | | | | | X | X | X | X | X | | X | | | X | | | | | | | | | | X | | | | | | | X | | | | | | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | | X | | | X | X | X | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | | | | X | X | X | X | | | | | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X |
|
nesuri/sorsolingo-asr-bsl | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 106511839.0
num_examples: 140
download_size: 103042340
dataset_size: 106511839.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nourheshamshaheen/ICPR_big_2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': area
'1': heatmap
'2': horizontal_bar
'3': horizontal_interval
'4': line
'5': manhattan
'6': map
'7': pie
'8': scatter
'9': scatter-line
'10': surface
'11': venn
'12': vertical_bar
'13': vertical_box
'14': vertical_interval
- name: pipeline_label
dtype:
class_label:
names:
'0': line
'1': other
'2': scatter
'3': scatter_line
'4': vertical_bar
- name: true_label
dtype: string
splits:
- name: train
num_bytes: 1192178239.45
num_examples: 22923
download_size: 725579368
dataset_size: 1192178239.45
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ICPR_big_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tasksource/resnli | ---
license: cc-by-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 4691316
num_examples: 25232
- name: validation
num_bytes: 801878
num_examples: 4624
- name: test
num_bytes: 1224540
num_examples: 7216
download_size: 956275
dataset_size: 6717734
---
https://github.com/ruixiangcui/WikiResNLI_NatResNLI
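The `config` column presumably distinguishes the WikiResNLI and NatResNLI subsets; a minimal sketch of grouping rows by it (the rows and config values below are illustrative assumptions, not taken from the real data files):

```python
from collections import defaultdict

# Toy rows mirroring the card's schema (premise, hypothesis, label, config);
# the config values here are assumed, not read from the actual dataset.
rows = [
    {"premise": "p1", "hypothesis": "h1", "label": "entailment", "config": "WikiResNLI"},
    {"premise": "p2", "hypothesis": "h2", "label": "neutral", "config": "NatResNLI"},
]

by_config = defaultdict(list)
for row in rows:
    by_config[row["config"]].append(row)
```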
```
@inproceedings{cui-etal-2023-failure,
title = "What does the Failure to Reason with {``}Respectively{''} in Zero/Few-Shot Settings Tell Us about Language Models?",
author = "Cui, Ruixiang and
Lee, Seolhwa and
Hershcovich, Daniel and
S{\o}gaard, Anders",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.489",
pages = "8786--8800",
abstract = "Humans can effortlessly understand the coordinate structure of sentences such as {``}Niels Bohr and Kurt Cobain were born in Copenhagen and Seattle, *respectively*{''}. In the context of natural language inference (NLI), we examine how language models (LMs) reason with respective readings (Gawron and Kehler, 2004) from two perspectives: syntactic-semantic and commonsense-world knowledge. We propose a controlled synthetic dataset WikiResNLI and a naturally occurring dataset NatResNLI to encompass various explicit and implicit realizations of {``}respectively{''}. We show that fine-tuned NLI models struggle with understanding such readings without explicit supervision. While few-shot learning is easy in the presence of explicit cues, longer training is required when the reading is evoked implicitly, leaving models to rely on common sense inferences. Furthermore, our fine-grained analysis indicates models fail to generalize across different constructions. To conclude, we demonstrate that LMs still lag behind humans in generalizing to the long tail of linguistic constructions.",
}
``` |
AdapterOcean/med_alpaca_standardized_cluster_74_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 29971331
num_examples: 14342
download_size: 15846635
dataset_size: 29971331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_74_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_simple_past_for_present_perfect | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1593
num_examples: 16
- name: test
num_bytes: 2475
num_examples: 30
- name: train
num_bytes: 17899
num_examples: 253
download_size: 16389
dataset_size: 21967
---
# Dataset Card for "MULTI_VALUE_cola_simple_past_for_present_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/quirky_addition | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 26317614
num_examples: 392000
- name: validation
num_bytes: 268420
num_examples: 4000
- name: test
num_bytes: 268552
num_examples: 4000
download_size: 0
dataset_size: 26854586
---
# Dataset Card for "quirky_addition"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-d0f125bb-b6fe-4a56-8bed-0f8d3744fc42-127121 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/wmt16-ro-en-sample
eval_info:
task: translation
model: autoevaluate/translation-not-evaluated
metrics: []
dataset_name: autoevaluate/wmt16-ro-en-sample
dataset_config: autoevaluate--wmt16-ro-en-sample
dataset_split: test
col_mapping:
source: translation.ro
target: translation.en
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Translation
* Model: autoevaluate/translation-not-evaluated
* Dataset: autoevaluate/wmt16-ro-en-sample
* Config: autoevaluate--wmt16-ro-en-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
vencortex/NewsPress | ---
dataset_info:
features:
- name: source
dtype: string
- name: date
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: context_id
dtype: string
- name: document_id
dtype: string
- name: document_type
dtype: string
splits:
- name: train
num_bytes: 837674064
num_examples: 1392701
download_size: 299569626
dataset_size: 837674064
---
# Dataset Card for "NewsPress"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_1713009319 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3221059
num_examples: 7848
download_size: 1600180
dataset_size: 3221059
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps | ---
pretty_name: Evaluation run of OpenAssistant/pythia-12b-sft-v8-7k-steps
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenAssistant/pythia-12b-sft-v8-7k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T22:42:11.722457](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps/blob/main/results_2023-10-15T22-42-11.722457.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00041946308724832214,\n\
\ \"em_stderr\": 0.000209698547078268,\n \"f1\": 0.04836619127516802,\n\
\ \"f1_stderr\": 0.0011660409478930682,\n \"acc\": 0.3794319917806236,\n\
\ \"acc_stderr\": 0.010932628099092904\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00041946308724832214,\n \"em_stderr\": 0.000209698547078268,\n\
\ \"f1\": 0.04836619127516802,\n \"f1_stderr\": 0.0011660409478930682\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10614101592115238,\n \
\ \"acc_stderr\": 0.008484346948434564\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6527229676400947,\n \"acc_stderr\": 0.013380909249751242\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T22_42_11.722457
path:
- '**/details_harness|drop|3_2023-10-15T22-42-11.722457.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T22-42-11.722457.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T22_42_11.722457
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-42-11.722457.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-42-11.722457.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:12:25.184971.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:12:25.184971.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T22_42_11.722457
path:
- '**/details_harness|winogrande|5_2023-10-15T22-42-11.722457.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T22-42-11.722457.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_12_25.184971
path:
- results_2023-07-19T18:12:25.184971.parquet
- split: 2023_10_15T22_42_11.722457
path:
- results_2023-10-15T22-42-11.722457.parquet
- split: latest
path:
- results_2023-10-15T22-42-11.722457.parquet
---
# Dataset Card for Evaluation run of OpenAssistant/pythia-12b-sft-v8-7k-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-sft-v8-7k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T22:42:11.722457](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps/blob/main/results_2023-10-15T22-42-11.722457.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.00041946308724832214,
"em_stderr": 0.000209698547078268,
"f1": 0.04836619127516802,
"f1_stderr": 0.0011660409478930682,
"acc": 0.3794319917806236,
"acc_stderr": 0.010932628099092904
},
"harness|drop|3": {
"em": 0.00041946308724832214,
"em_stderr": 0.000209698547078268,
"f1": 0.04836619127516802,
"f1_stderr": 0.0011660409478930682
},
"harness|gsm8k|5": {
"acc": 0.10614101592115238,
"acc_stderr": 0.008484346948434564
},
"harness|winogrande|5": {
"acc": 0.6527229676400947,
"acc_stderr": 0.013380909249751242
}
}
```
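As a sanity check, the aggregated `acc` in the `"all"` block appears to be the unweighted mean of the per-task accuracies. A minimal sketch recomputing it from the per-task values shown above (that the leaderboard averages without weighting is our assumption, though it matches the reported number here):

```python
# Values copied verbatim from the JSON results above.
results = {
    "harness|gsm8k|5": {"acc": 0.10614101592115238},
    "harness|winogrande|5": {"acc": 0.6527229676400947},
}

# Unweighted mean of the per-task accuracies; this reproduces the
# "all" accuracy of 0.3794319917806236 reported above.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)
```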
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
guidobenb/vcdb | ---
license: apache-2.0
task_categories:
- token-classification
language:
- en
size_categories:
- n<1K
pretty_name: NER for VERIS
--- |
KETI-AIR/kowow | ---
license: cc-by-4.0
---
|
polinaeterna/push_to_hub_empty | ---
dataset_info:
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 48
num_examples: 3
download_size: 1300
dataset_size: 48
configs_kwargs:
config_name: default
data_dir: default
---
# Dataset Card for "push_to_hub_empty"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/wikitext-103-raw-v1-sent-permute-1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1091572850
num_examples: 3602699
- name: validation
num_bytes: 1159288
num_examples: 3760
- name: test
num_bytes: 1305088
num_examples: 4358
download_size: 631581820
dataset_size: 1094037226
---
# Dataset Card for "wikitext-103-raw-v1-sent-permute-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cheetor1996/Akane_Nanao | ---
license: cc-by-2.0
language:
- en
tags:
- art
pretty_name: Akane Nanao - Akane wa Tsumare Somerareru
---
**Akane Nanao** from **Akane wa Tsumare Somerareru**
- *Trained with anime (full-final-pruned) model.*
- *Works well with ALL, MIDD, OUTD, and OUTALL LoRA weight blocks.*
- *Recommended weights: 0.8-1.0* |
Liberty-L/race_train_EN | ---
dataset_info:
features:
- name: example_id
dtype: string
- name: article
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: options
sequence: string
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: label
dtype: int64
splits:
- name: train
num_bytes: 162846173
num_examples: 25421
download_size: 27127431
dataset_size: 162846173
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ai-danger/spicyfiction | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- n<1K
--- |
OpenLeecher/double_take_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: query
dtype: string
- name: response
dtype: string
- name: system
dtype: string
- name: category
dtype: string
- name: id
dtype: string
- name: validated
dtype: string
splits:
- name: train
num_bytes: 10071490
num_examples: 4632
- name: test
num_bytes: 553282
num_examples: 256
download_size: 5332183
dataset_size: 10624772
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
patent/AIPD_nlp_sentence_dataset_v2 | ---
dataset_info:
features:
- name: patent_num
dtype: int64
- name: claim_num1
dtype: int64
- name: claim_num2
dtype: int64
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 1141724170.7014475
num_examples: 453043
- name: test
num_bytes: 63431500.71087167
num_examples: 25170
- name: valid
num_bytes: 63428980.58768093
num_examples: 25169
download_size: 481158714
dataset_size: 1268584652.0
---
# Dataset Card for "AIPD_nlp_sentence_dataset_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mxronga/ooduarere | ---
license: apache-2.0
language:
- yo
tags:
- pretrain
--- |
CyberHarem/koshiba_mai_watashinoyuriwaoshigotodesu | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koshiba Mai
This is the dataset of Koshiba Mai, containing 220 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 220 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 507 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 220 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 220 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 220 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 220 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 220 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 507 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 507 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 507 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
maghwa/OpenHermes-2-AR-10K-27-690k-700k | ---
dataset_info:
features:
- name: language
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: conversations
dtype: string
- name: category
dtype: 'null'
- name: id
dtype: 'null'
- name: topic
dtype: 'null'
- name: hash
dtype: 'null'
- name: model_name
dtype: 'null'
- name: idx
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: model
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: title
dtype: 'null'
- name: views
dtype: float64
- name: source
dtype: string
- name: custom_instruction
dtype: 'null'
splits:
- name: train
num_bytes: 25177131
num_examples: 10001
download_size: 11407724
dataset_size: 25177131
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-high_school_chemistry-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 120956
num_examples: 203
download_size: 64751
dataset_size: 120956
---
# Dataset Card for "mmlu-high_school_chemistry-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Writer__palmyra-20b-chat | ---
pretty_name: Evaluation run of Writer/palmyra-20b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Writer/palmyra-20b-chat](https://huggingface.co/Writer/palmyra-20b-chat) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__palmyra-20b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T17:34:48.335583](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-20b-chat/blob/main/results_2023-10-24T17-34-48.335583.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01373741610738255,\n\
\ \"em_stderr\": 0.0011920334890960986,\n \"f1\": 0.07696308724832225,\n\
\ \"f1_stderr\": 0.0018555585236602612,\n \"acc\": 0.3519928816466039,\n\
\ \"acc_stderr\": 0.009314927967596935\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.01373741610738255,\n \"em_stderr\": 0.0011920334890960986,\n\
\ \"f1\": 0.07696308724832225,\n \"f1_stderr\": 0.0018555585236602612\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.039423805913570885,\n \
\ \"acc_stderr\": 0.005360280030342453\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.664561957379637,\n \"acc_stderr\": 0.013269575904851418\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Writer/palmyra-20b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T17_34_48.335583
path:
- '**/details_harness|drop|3_2023-10-24T17-34-48.335583.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T17-34-48.335583.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T17_34_48.335583
path:
- '**/details_harness|gsm8k|5_2023-10-24T17-34-48.335583.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T17-34-48.335583.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-46-04.606475.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T18-46-04.606475.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T17_34_48.335583
path:
- '**/details_harness|winogrande|5_2023-10-24T17-34-48.335583.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T17-34-48.335583.parquet'
- config_name: results
data_files:
- split: 2023_10_08T18_46_04.606475
path:
- results_2023-10-08T18-46-04.606475.parquet
- split: 2023_10_24T17_34_48.335583
path:
- results_2023-10-24T17-34-48.335583.parquet
- split: latest
path:
- results_2023-10-24T17-34-48.335583.parquet
---
# Dataset Card for Evaluation run of Writer/palmyra-20b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/palmyra-20b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/palmyra-20b-chat](https://huggingface.co/Writer/palmyra-20b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__palmyra-20b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T17:34:48.335583](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-20b-chat/blob/main/results_2023-10-24T17-34-48.335583.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" splits for each eval):
```python
{
"all": {
"em": 0.01373741610738255,
"em_stderr": 0.0011920334890960986,
"f1": 0.07696308724832225,
"f1_stderr": 0.0018555585236602612,
"acc": 0.3519928816466039,
"acc_stderr": 0.009314927967596935
},
"harness|drop|3": {
"em": 0.01373741610738255,
"em_stderr": 0.0011920334890960986,
"f1": 0.07696308724832225,
"f1_stderr": 0.0018555585236602612
},
"harness|gsm8k|5": {
"acc": 0.039423805913570885,
"acc_stderr": 0.005360280030342453
},
"harness|winogrande|5": {
"acc": 0.664561957379637,
"acc_stderr": 0.013269575904851418
}
}
```
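As a quick sanity check (a minimal sketch, assuming the "all" aggregate is the unweighted mean over the accuracy tasks), the reported `acc` can be reproduced locally from the per-task values above:

```python
# Per-task accuracies copied verbatim from the "Latest results" JSON above.
per_task_acc = {
    "harness|gsm8k|5": 0.039423805913570885,
    "harness|winogrande|5": 0.664561957379637,
}

# Unweighted mean over the accuracy tasks (drop/f1 tasks are aggregated separately).
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ~0.3519928816466039, matching the "all"/"acc" field above
```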
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DragonLine/ksponspeech_eval_clean_test_preprocess | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 2881455880
num_examples: 3000
download_size: 425192625
dataset_size: 2881455880
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
dmacres/mimiciii-hospitalcourse-meta | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: subject_id
dtype: int64
- name: hadm_id
dtype: float64
- name: target_text
dtype: string
- name: extractive_notes_summ
dtype: string
- name: n_notes
dtype: int64
- name: notes
list:
- name: category
dtype: string
- name: chartdate
dtype: string
- name: description
dtype: string
- name: row_id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1518715010
num_examples: 24993
- name: validation
num_bytes: 342865059
num_examples: 5356
- name: test
num_bytes: 326661857
num_examples: 5356
download_size: 896512070
dataset_size: 2188241926
---
# Dataset Card for "mimiciii-hospitalcourse-meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
C-MTEB/CLSClusteringS2S | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: sentences
sequence: string
- name: labels
sequence: string
splits:
- name: test
num_bytes: 6895612
num_examples: 10
download_size: 4483035
dataset_size: 6895612
---
# Dataset Card for "CLSClusteringS2S"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wangshasha3575/ceshi | ---
license: afl-3.0
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-2bc9e0-1812262541 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/pegasus-cnn_dailymail
metrics: ['bleu']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-cnn_dailymail
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@DongfuTingle](https://huggingface.co/DongfuTingle) for evaluating this model. |
pruhtopia/multilingual-bert-toc-95k-dataset | ---
license: apache-2.0
task_categories:
- text-classification
tags:
- legal
---
## Dataset Details
### Dataset Description
Contains line-by-line sequences from human-annotated legal/government documents and their corresponding labels.
Line-by-line examples are derived from the DocLayNet [dataset](https://huggingface.co/datasets/pierreguillou/DocLayNet-base).
### Dataset Creation
Notebook displaying how dataset was created can be accessed [here](https://colab.research.google.com/drive/1FA2puJ71lKKqB2R0ZrtfbFgr5HodUJ4N?usp=sharing) |
tyzhu/find_first_sent_train_100_eval_10_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 340962
num_examples: 210
- name: validation
num_bytes: 18119
num_examples: 10
download_size: 0
dataset_size: 359081
---
# Dataset Card for "find_first_sent_train_100_eval_10_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nabonator/hricons | ---
dataset_info:
features:
- name: image
dtype: image
- name: ' text'
dtype: string
splits:
- name: train
num_bytes: 5369086.0
num_examples: 102
download_size: 5311390
dataset_size: 5369086.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arieg/bw_spec_cls_80_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '91102'
'1': '91130'
'2': '91157'
'3': '91158'
'4': '91159'
'5': '91160'
'6': '91161'
'7': '91162'
'8': '91163'
'9': '91164'
'10': '91177'
'11': '91178'
'12': '91179'
'13': '91181'
'14': '91182'
'15': '91183'
'16': '91184'
'17': '91185'
'18': '91186'
'19': '91187'
'20': '91205'
'21': '91228'
'22': '91238'
'23': '91306'
'24': '91309'
'25': '91312'
'26': '91315'
'27': '91317'
'28': '91318'
'29': '91319'
'30': '91329'
'31': '91349'
'32': '91443'
'33': '91455'
'34': '91458'
'35': '91459'
'36': '91619'
'37': '91620'
'38': '91621'
'39': '91622'
'40': '91623'
'41': '91624'
'42': '91625'
'43': '91755'
'44': '91788'
'45': '91790'
'46': '91791'
'47': '91793'
'48': '91796'
'49': '91797'
'50': '91851'
'51': '91868'
'52': '91869'
'53': '91894'
'54': '91897'
'55': '91899'
'56': '91900'
'57': '91933'
'58': '91934'
'59': '91936'
'60': '91937'
'61': '91938'
'62': '91958'
'63': '91960'
'64': '92124'
'65': '92125'
'66': '92129'
'67': '92130'
'68': '92131'
'69': '92206'
'70': '92275'
'71': '92282'
'72': '92283'
'73': '92284'
'74': '92292'
'75': '92466'
'76': '92508'
'77': '92535'
'78': '92536'
'79': '92538'
splits:
- name: train
num_bytes: 91402580.8
num_examples: 1600
- name: test
num_bytes: 22793187.0
num_examples: 400
download_size: 114027453
dataset_size: 114195767.8
---
# Dataset Card for "bw_spec_cls_80_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RJCentury/GeneralScaffholding | ---
license: openrail
---
|
zelalt/shorter_papers | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 290109
num_examples: 31
download_size: 105474
dataset_size: 290109
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mangeshdiyewar/sanskrit_eng | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36409226
num_examples: 75162
- name: test
num_bytes: 5652086
num_examples: 11722
- name: validation
num_bytes: 3037311
num_examples: 6149
download_size: 22123896
dataset_size: 45098623
---
# Dataset Card for "sanskrit_eng"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/architecture_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 504676514
num_examples: 1000000
download_size: 64149073
dataset_size: 504676514
---
# Dataset Card for "architecture_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PurCL/marinda-type-inference-debuginfo-only-O0-shuffle | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: metadata
struct:
- name: binary_name
dtype: string
- name: function_addr
dtype: int64
- name: function_name
dtype: string
- name: project_name
dtype: string
- name: code_w_type
dtype: string
- name: code
dtype: string
- name: data_dep
dtype: string
splits:
- name: train
num_bytes: 268866704.4189582
num_examples: 55771
- name: test
num_bytes: 29875149.581041828
num_examples: 6197
download_size: 63950792
dataset_size: 298741854.0
---
# Dataset Card for "marinda-type-inference-debuginfo-only-O0-shuffle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vikp/doclaynet_bench | ---
dataset_info:
features:
- name: image
dtype: image
- name: bboxes
sequence:
sequence: float64
- name: labels
sequence: int64
- name: words
sequence: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 400520474.6455857
num_examples: 1011
download_size: 398853977
dataset_size: 400520474.6455857
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_131 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 830193376
num_examples: 161768
download_size: 846049465
dataset_size: 830193376
---
# Dataset Card for "chunk_131"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrey200702/Features | ---
license: mit
---
|
tyzhu/find_word_train_1000_eval_100 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 152196
num_examples: 2100
- name: eval_find_word
num_bytes: 5323
num_examples: 100
download_size: 3424
dataset_size: 157519
---
# Dataset Card for "find_word_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713080494 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20962
num_examples: 48
download_size: 11230
dataset_size: 20962
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yashnbx/iamgroot | ---
license: mit
---
|
joetey/annotated_github_dataset | ---
dataset_info:
features:
- name: function
dtype: string
- name: repo_name
dtype: string
- name: path
dtype: string
- name: features
sequence: float32
- name: purpose
dtype: string
- name: detailed_description
dtype: string
- name: code_trans
dtype: string
- name: runtime
dtype: string
splits:
- name: train
num_bytes: 42019
num_examples: 35
download_size: 21835
dataset_size: 42019
---
# Dataset Card for "annotated_github_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/boise_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of boise/ボイシ/博伊西 (Azur Lane)
This is the dataset of boise/ボイシ/博伊西 (Azur Lane), containing 59 images and their tags.
The core tags of this character are `long_hair, breasts, yellow_eyes, blue_hair, animal_ears, rabbit_ears, large_breasts, fake_animal_ears, braid, mechanical_ears, very_long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 59 | 117.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/boise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 59 | 55.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/boise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 143 | 119.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/boise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 59 | 96.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/boise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 143 | 190.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/boise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/boise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, black_leotard, black_pantyhose, looking_at_viewer, official_alternate_costume, playboy_bunny, solo, bodystocking, detached_sleeves, thighband_pantyhose, white_background, black_sleeves, blush, simple_background, necktie_between_breasts, covered_navel, holding_hair, bare_shoulders, covering_mouth, detached_collar, sitting, strapless_leotard |
| 1 | 5 |  |  |  |  |  | 1girl, black_footwear, black_leotard, black_pantyhose, bodystocking, detached_sleeves, holding_hair, looking_at_viewer, official_alternate_costume, playboy_bunny, solo, bare_shoulders, high_heels, sitting, black_sleeves, blush, long_sleeves, medium_breasts, strapless, chair, couch, covered_mouth, covered_navel, from_above, full_body, microphone_stand, speaker, thighband_pantyhose, white_background |
| 2 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, covered_navel, blue_gloves, covered_mouth, garter_straps, mask, white_thighhighs, blush, bodystocking, bodysuit, dress, hair_between_eyes, light_blue_hair, see-through, skindentation |
| 3 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, black_thighhighs, china_dress, covered_navel, folding_fan, hair_flower, holding_fan, official_alternate_costume, black_gloves, cleavage_cutout, earrings, elbow_gloves, fingerless_gloves, red_dress, black_dress, black_footwear, bridal_gauntlets, brown_thighhighs, full_body, high_heels, sitting, standing, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_leotard | black_pantyhose | looking_at_viewer | official_alternate_costume | playboy_bunny | solo | bodystocking | detached_sleeves | thighband_pantyhose | white_background | black_sleeves | blush | simple_background | necktie_between_breasts | covered_navel | holding_hair | bare_shoulders | covering_mouth | detached_collar | sitting | strapless_leotard | black_footwear | high_heels | long_sleeves | medium_breasts | strapless | chair | couch | covered_mouth | from_above | full_body | microphone_stand | speaker | blue_gloves | garter_straps | mask | white_thighhighs | bodysuit | dress | hair_between_eyes | light_blue_hair | see-through | skindentation | black_thighhighs | china_dress | folding_fan | hair_flower | holding_fan | black_gloves | cleavage_cutout | earrings | elbow_gloves | fingerless_gloves | red_dress | black_dress | bridal_gauntlets | brown_thighhighs | standing | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:------------------|:--------------------|:-----------------------------|:----------------|:-------|:---------------|:-------------------|:----------------------|:-------------------|:----------------|:--------|:--------------------|:--------------------------|:----------------|:---------------|:-----------------|:-----------------|:------------------|:----------|:--------------------|:-----------------|:-------------|:---------------|:-----------------|:------------|:--------|:--------|:----------------|:-------------|:------------|:-------------------|:----------|:--------------|:----------------|:-------|:-------------------|:-----------|:--------|:--------------------|:------------------|:--------------|:----------------|:-------------------|:--------------|:--------------|:--------------|:--------------|:---------------|:------------------|:-----------|:---------------|:--------------------|:------------|:--------------|:-------------------|:-------------------|:-----------|:---------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | | X | | | X | X | | | | | X | | | X | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | X | | X | | | | | | | | | X | | X | | | X | | X | X | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
pontusnorman123/swe_set2_973_sroie | ---
dataset_info:
features:
- name: id
dtype: int64
- name: words
sequence: string
- name: bboxes
sequence:
sequence: float64
- name: ner_tags
sequence:
class_label:
names:
'0': I-COMPANY
'1': I-DATE
'2': I-ADDRESS
'3': I-TOTAL
'4': O
- name: image
dtype: image
splits:
- name: train
num_bytes: 1288055514.25
num_examples: 1222
- name: test
num_bytes: 53446678.0
num_examples: 50
download_size: 1321443890
dataset_size: 1341502192.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
easytpp/earthquake | ---
license: apache-2.0
---
|