| datasetId | card |
|---|---|
NobodyExistsOnTheInternet/OrcaSysmsg | ---
license: mit
---
|
autoevaluate/autoeval-eval-phpthinh__examplehsd-raw-ff3db7-1730160386 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/examplehsd
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b1
metrics: ['f1']
dataset_name: phpthinh/examplehsd
dataset_config: raw
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: phpthinh/examplehsd
* Config: raw
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-cd8e90-16116214 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP11
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP11
* Dataset: launch/gov_report
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
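The `col_mapping` block in the card above tells the evaluator which dataset columns feed each task input (here, `document` supplies `text` and `summary` supplies `target`). As a rough plain-Python sketch of that renaming, using a hypothetical record:

```python
# Sketch: apply a col_mapping (task column -> dataset column) to one record.
# The record below is hypothetical; the mapping keys come from the card's YAML.
col_mapping = {"text": "document", "target": "summary"}

def apply_col_mapping(record, mapping):
    """Rename dataset columns to the names the evaluation task expects."""
    return {task_col: record[dataset_col] for task_col, dataset_col in mapping.items()}

record = {"document": "Full report text ...", "summary": "Short summary."}
print(apply_col_mapping(record, col_mapping))
# {'text': 'Full report text ...', 'target': 'Short summary.'}
```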
ComradeBallin/PixelSprite | ---
license: unknown
---
|
CyberHarem/toxico_dannar_futokunoguild | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Toxico Dannar
This is the dataset of Toxico Dannar, containing 270 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, etc.); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 270 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 613 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 270 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 270 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 270 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 270 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 270 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 613 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 613 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 613 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
abnv15/tommy_jackets_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 144501.0
num_examples: 3
download_size: 142534
dataset_size: 144501.0
---
# Dataset Card for "tommy_jackets_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AgenteSpider/TheBatman | ---
license: openrail
---
|
freddyaboulton/dope_data_points | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Pablao0948/Xuxa_Circo | ---
license: openrail
---
|
asas-ai/mlqa-ar-ar | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 8210810
num_examples: 5335
- name: validation
num_bytes: 808221
num_examples: 517
download_size: 3991496
dataset_size: 9019031
license: cc-by-sa-3.0
task_categories:
- question-answering
language:
- ar
pretty_name: mlqa-ar-ar
---
# Dataset Card for "mlqa-ar-ar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qmeeus/grabo | ---
dataset_info:
features:
- name: uttid
dtype: string
- name: audio
dtype: audio
- name: text
dtype: string
- name: intent
dtype: string
splits:
- name: pp10
num_bytes: 228028321.0
num_examples: 540
- name: pp11
num_bytes: 211504118.0
num_examples: 541
- name: pp12
num_bytes: 322474928.0
num_examples: 540
- name: pp2
num_bytes: 233171644.0
num_examples: 541
- name: pp3
num_bytes: 300904068.0
num_examples: 540
- name: pp4
num_bytes: 199806236.0
num_examples: 540
- name: pp5
num_bytes: 229715190.0
num_examples: 540
- name: pp6
num_bytes: 371927769.0
num_examples: 574
- name: pp7
num_bytes: 188155834.0
num_examples: 571
- name: pp8
num_bytes: 236232429.0
num_examples: 540
- name: pp9
num_bytes: 302685363.0
num_examples: 540
download_size: 2694175888
dataset_size: 2824605900.0
---
# Dataset Card for "grabo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
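The per-speaker split metadata in this card's frontmatter is self-consistent; a quick plain-Python check (numbers copied from the YAML above) confirms that the split sizes sum to the declared `dataset_size` and to 6,007 examples in total:

```python
# Verify that the per-speaker (num_bytes, num_examples) pairs from the card's
# YAML add up to the declared dataset totals.
splits = {
    "pp10": (228028321, 540), "pp11": (211504118, 541), "pp12": (322474928, 540),
    "pp2": (233171644, 541), "pp3": (300904068, 540), "pp4": (199806236, 540),
    "pp5": (229715190, 540), "pp6": (371927769, 574), "pp7": (188155834, 571),
    "pp8": (236232429, 540), "pp9": (302685363, 540),
}
total_bytes = sum(b for b, _ in splits.values())
total_examples = sum(n for _, n in splits.values())
print(total_bytes, total_examples)  # 2824605900 6007 — matches dataset_size
```

(`download_size` is smaller because it measures the compressed Parquet files.)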
polinaeterna/image | ---
configs:
- config_name: labels
drop_labels: false
- config_name: no_labels
drop_labels: true
--- |
liuyanchen1015/MULTI_VALUE_mrpc_were_was | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 44115
num_examples: 162
- name: train
num_bytes: 105162
num_examples: 391
- name: validation
num_bytes: 9219
num_examples: 34
download_size: 111643
dataset_size: 158496
---
# Dataset Card for "MULTI_VALUE_mrpc_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
P1ayer-1/annas-archive-index | ---
dataset_info:
features:
- name: zlibrary_id
dtype: int64
- name: date_added
dtype: string
- name: date_modified
dtype: string
- name: extension
dtype: string
- name: filesize
dtype: int64
- name: filesize_reported
dtype: int64
- name: md5
dtype: string
- name: md5_reported
dtype: string
- name: title
dtype: string
- name: author
dtype: string
- name: publisher
dtype: string
- name: language
dtype: string
- name: series
dtype: string
- name: volume
dtype: string
- name: edition
dtype: string
- name: year
dtype: string
- name: pages
dtype: string
- name: description
dtype: string
- name: cover_url
dtype: string
- name: in_libgen
dtype: int64
- name: pilimi_torrent
dtype: string
- name: unavailable
dtype: int64
splits:
- name: train
num_bytes: 8721445697
num_examples: 11783153
download_size: 4461593028
dataset_size: 8721445697
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "annas-archive-index"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dipteshkanojia/hatecheckhin | ---
license: mit
---
|
datahrvoje/twitter_dataset_1713072780 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21719
num_examples: 48
download_size: 10622
dataset_size: 21719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Abzu/dolly_hhrlhf | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 22346337.075312525
num_examples: 35205
- name: test
num_bytes: 2483137.924687476
num_examples: 3912
download_size: 16025539
dataset_size: 24829475
license: cc-by-sa-3.0
task_categories:
- question-answering
- text2text-generation
language:
- en
---
# Dataset Card for "dolly_hhrlhf"
This is the mosaicml/dolly_hhrlhf dataset with some duplicates removed.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JinuAugustine/gdpr-classification | ---
language:
- en
license: apache-2.0
---
|
johanhag/winter | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1018224877.0
num_examples: 120
download_size: 70293508
dataset_size: 1018224877.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ahdsoft/coding | ---
license: mit
---
|
xlangai/the-stack-vault-smol | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: float64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: float64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: float64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 2274352.96
num_examples: 265
download_size: 1001690
dataset_size: 2274352.96
---
# Dataset Card for "the-stack-vault-smol"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-squad_v2-squad_v2-d6c6f4-2395874932 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: mlxen/electra-smallcase-squad
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: mlxen/electra-smallcase-squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ob](https://huggingface.co/ob) for evaluating this model. |
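The dotted names in this card's `col_mapping` (e.g. `answers.text`) address nested fields of each record. A minimal resolver for such dotted paths, sketched in plain Python with a hypothetical SQuAD-style record:

```python
# Sketch: resolve a dotted col_mapping path (e.g. "answers.text") against a
# nested record. The record below is hypothetical.
def resolve(record, dotted_path):
    """Walk a nested dict following the keys of a dotted path."""
    value = record
    for key in dotted_path.split("."):
        value = value[key]
    return value

record = {"context": "...", "question": "...",
          "answers": {"text": ["Denver"], "answer_start": [12]}}
print(resolve(record, "answers.text"))          # ['Denver']
print(resolve(record, "answers.answer_start"))  # [12]
```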
tollefj/sts_coco_captions_quintets | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float32
splits:
- name: train
num_bytes: 96944086
num_examples: 828395
download_size: 26601520
dataset_size: 96944086
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ashutoshmondal/katana | ---
license: bigscience-openrail-m
---
|
Eip/autotrain-data-real-vs-fake-news | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: real-vs-fake-news
## Dataset Description
This dataset has been automatically processed by AutoTrain for project real-vs-fake-news.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_title": "FBI Russia probe helped by Australian diplomat tip-off: NYT",
"text": "WASHINGTON (Reuters) - Trump campaign adviser George Papadopoulos told an Australian diplomat in May 2016 that Russia had political dirt on Democratic presidential candidate Hillary Clinton, the New York Times reported on Saturday. The conversation between Papadopoulos and the diplomat, Alexander Downer, in London was a driving factor behind the FBI\u2019s decision to open a counter-intelligence investigation of Moscow\u2019s contacts with the Trump campaign, the Times reported. Two months after the meeting, Australian officials passed the information that came from Papadopoulos to their American counterparts when leaked Democratic emails began appearing online, according to the newspaper, which cited four current and former U.S. and foreign officials. Besides the information from the Australians, the probe by the Federal Bureau of Investigation was also propelled by intelligence from other friendly governments, including the British and Dutch, the Times said. Papadopoulos, a Chicago-based international energy lawyer, pleaded guilty on Oct. 30 to lying to FBI agents about contacts with people who claimed to have ties to top Russian officials. It was the first criminal charge alleging links between the Trump campaign and Russia. The White House has played down the former aide\u2019s campaign role, saying it was \u201cextremely limited\u201d and that any actions he took would have been on his own. The New York Times, however, reported that Papadopoulos helped set up a meeting between then-candidate Donald Trump and Egyptian President Abdel Fattah al-Sisi and edited the outline of Trump\u2019s first major foreign policy speech in April 2016. The federal investigation, which is now being led by Special Counsel Robert Mueller, has hung over Trump\u2019s White House since he took office almost a year ago. Some Trump allies have recently accused Mueller\u2019s team of being biased against the Republican president. 
Lawyers for Papadopoulos did not immediately respond to requests by Reuters for comment. Mueller\u2019s office declined to comment. Trump\u2019s White House attorney, Ty Cobb, declined to comment on the New York Times report. \u201cOut of respect for the special counsel and his process, we are not commenting on matters such as this,\u201d he said in a statement. Mueller has charged four Trump associates, including Papadopoulos, in his investigation. Russia has denied interfering in the U.S. election and Trump has said there was no collusion between his campaign and Moscow. ",
"feat_subject": "politicsNews",
"feat_date": "December 30, 2017 ",
"target": 1
},
{
"feat_title": "Democrats ride grassroots wave to major statehouse gains",
"text": "(Reuters) - Democrats claimed historic gains in Virginia\u2019s statehouse and booted Republicans from state and local office across the United States on Tuesday, in the party\u2019s first big wave of victories since Republican Donald Trump\u2019s won the White House a year ago. Democrats must figure out how to turn that momentum to their advantage in November 2018 elections, when control of the U.S. Congress and scores of statehouses will be at stake. From coast to coast, Democratic victories showed grassroots resistance to Trump rallying the party\u2019s base, while independent and conservative voters appeared frustrated with the unpopular Republican leadership in Washington. Democrats won this year\u2019s races for governor in Virginia and New Jersey, but successes in legislative and local races nationwide may have revealed more about where the party stands a year into Trump\u2019s administration. Unexpectedly massive Democratic gains in Virginia\u2019s statehouse surprised even the most optimistic party loyalists in a state that has trended Democratic in recent years but remains a top target for both parties in national elections. \u201cThis is beyond our wildest expectations, to be honest,\u201d said Catherine Vaughan, co-founder of Flippable, one of several new startup progressive groups rebuilding the party at the grassroots level. With several races still too close to call, Democrats were close to flipping, or splitting, control of the Virginia House of Delegates, erasing overnight a two-to-one Republican majority. Democratic Lieutenant Governor Ralph Northam also defeated Republican Ed Gillespie by nearly nine percentage points in what had seemed a closer contest for Virginia\u2019s governor\u2019s mansion, a year after Democrat Hillary Clinton carried the state by five points in the presidential election. 
The losing candidate had employed Trump-style campaign tactics that highlighted divisive issues such as immigration, although the president did not join him on the campaign trail. In New Jersey, a Democratic presidential stronghold, voters replaced a two-term Republican governor with a Democrat and increased the party\u2019s majorities in the state legislature. Democrats notched additional wins in a Washington state Senate race that gave the party full control of the state government and in Republican-controlled Georgia, where Democrats picked up three seats in special state legislative elections. \u201cThis was the first chance that the voters got to send a message to Donald Trump and they took advantage of it,\u201d John Feehery, a Republican strategist in Washington, said by phone. The gains suggested to some election analysts that Democrats could retake the U.S. House of Representatives next year. Republicans control both the House and Senate along with the White House. Dave Wasserman, who analyzes U.S. House and statehouse races for the nonpartisan Cook Political Report, called the Virginia results a \u201ctidal wave.\u201d Even after Tuesday\u2019s gains, however, Democrats are completely locked out of power in 26 state governments. Republicans control two-thirds of U.S. legislative chambers. Desperate to rebuild, national Democrats this year showed newfound interest in legislative contests and races even farther down the ballot. The Democratic National Committee successfully invested in mayoral races from St. Petersburg, Florida, to Manchester, New Hampshire. \u201cIf there is a lesson to be taken from yesterday, it is that we need to make sure that we are competing everywhere, because Democrats can win,\u201d DNC Chairman Tom Perez said on a media call. Democratic Legislative Campaign Committee executive director Jessica Post said national party leaders must remain focused on local races, even in a congressional year. 
\u201cWe don\u2019t focus enough on the state level, and that is why we are in the place we are,\u201d she said. \u201cBut when we do, we win.\u201d ",
"feat_subject": "politicsNews",
"feat_date": "November 8, 2017 ",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_title": "Value(dtype='string', id=None)",
"text": "Value(dtype='string', id=None)",
"feat_subject": "Value(dtype='string', id=None)",
"feat_date": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['Fake', 'True'], id=None)"
}
```
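Since `target` is a `ClassLabel` with names `['Fake', 'True']`, the integer labels decode straightforwardly; a minimal sketch in plain Python (no `datasets` dependency):

```python
# Decode integer "target" labels using the ClassLabel names from the schema above.
names = ["Fake", "True"]
str2int = {name: i for i, name in enumerate(names)}
int2str = dict(enumerate(names))

print(int2str[1])       # 'True' — the label of both samples shown above
print(str2int["Fake"])  # 0
```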
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1598 |
| valid | 400 |
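The split sizes above correspond to roughly an 80/20 train/validation split, as a quick check confirms:

```python
# Quick check: the train/valid sizes from the table above imply a ~80/20 split.
train, valid = 1598, 400
total = train + valid
train_frac = train / total
print(f"train fraction: {train_frac:.3f}")  # ~0.800
```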
|
chavinlo/tempofunk-old | ---
license: agpl-3.0
task_categories:
- video-classification
- visual-question-answering
language:
- en
tags:
- video
- video generation
pretty_name: TempoFunk!
size_categories:
- 10K<n<100K
---
<img src="https://s3.amazonaws.com/moonup/production/uploads/632eed9e04b24dbdb9eaa6d4/ToFJ26XGVkO2FTJ4dH-yH.png" width="256" height="256"> |
carnival13/xlmr_int_pr_sw_trn_ep4 | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 575211680
num_examples: 452280
download_size: 164056118
dataset_size: 575211680
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xlmr_int_pr_sw_trn_ep4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ashreen/dataset-IN-Abs | ---
license: apache-2.0
---
|
AdvayK/SFD_7_9010_split | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 495058139.16573346
num_examples: 803
- name: test
num_bytes: 52309573.83426652
num_examples: 90
download_size: 444464113
dataset_size: 547367713.0
---
# Dataset Card for "SFD_7_9010_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dcarpintero/arxiv.cs.CL.embedv3.clustering.medium | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
pretty_name: arxiv.cs.CL.embedv3.clustering.medium
---
This dataset comprises metadata for the 10K most recent arXiv papers in cs.CL (Computation and Language), collected up to 23 November 2023. Each metadata entry has been enriched with 'title' and 'abstract' embeddings, generated using Cohere's Embed-v3 with the 'clustering' input type. |
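Embedding vectors intended for clustering are typically compared by cosine similarity. A minimal pure-Python sketch (the 4-d vectors below are hypothetical stand-ins; actual Embed-v3 vectors are much higher-dimensional):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical low-dimensional stand-ins for title/abstract embeddings.
title_vec = [0.1, 0.3, 0.5, 0.2]
abstract_vec = [0.2, 0.25, 0.45, 0.3]
print(cosine(title_vec, abstract_vec))
```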
counterintuitive/qualitative_evaluation_set | ---
dataset_info:
features:
- name: evaluation
struct:
- name: gpt_result
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 5251
num_examples: 3
download_size: 0
dataset_size: 5251
---
# Dataset Card
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Stasu3838383/Spongebob | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 109427.0
num_examples: 1
download_size: 109503
dataset_size: 109427.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_second_sent_train_100_eval_10_hint3 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 268793
num_examples: 210
- name: validation
num_bytes: 10276
num_examples: 10
download_size: 138189
dataset_size: 279069
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_second_sent_train_100_eval_10_hint3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/haruka_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of haruka/伊草ハルカ/遥香 (Blue Archive)
This is the dataset of haruka/伊草ハルカ/遥香 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `purple_hair, hair_between_eyes, purple_eyes, halo, hair_ornament, hairclip, purple_halo, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, etc.); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 766.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 644.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1229 | 1.29 GiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/haruka_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, alternate_costume, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, solo, strapless_leotard, bare_shoulders, blush, fake_tail, rabbit_tail, black_leotard, full_body, simple_background, small_breasts, white_background, ass, bare_legs, feet, from_behind, looking_back, medium_hair, pantyhose, toes |
| 1 | 5 |  |  |  |  |  | 1girl, alternate_costume, black_leotard, blush, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, solo, strapless_leotard, closed_mouth, detached_collar, small_breasts, wrist_cuffs, bare_shoulders, belt, medium_hair, simple_background, gun, pantyhose, smile, white_background |
| 2 | 42 |  |  |  |  |  | long_hair, hair_bow, looking_at_viewer, blush, 1girl, alternate_costume, solo, purple_dress, simple_background, black_dress, earrings, bare_shoulders, white_background, holding, closed_mouth, alternate_hairstyle, cleavage |
| 3 | 11 |  |  |  |  |  | 1girl, black_skirt, garrison_cap, juliet_sleeves, looking_at_viewer, shotgun, simple_background, solo, white_background, pleated_skirt, holding_gun, black_jacket, belt, black_headwear, blush, medium_hair, boots, miniskirt, open_mouth, red_shirt, black_footwear, full_body |
| 4 | 15 |  |  |  |  |  | 1girl, black_skirt, garrison_cap, juliet_sleeves, blush, pleated_skirt, solo, looking_at_viewer, open_mouth, simple_background, miniskirt, collared_shirt, purple_jacket, black_belt, black_jacket, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, garrison_cap, juliet_sleeves, solo, upper_body, black_belt, blush, collared_shirt, looking_at_viewer, open_mouth, purple_jacket, smile, black_headwear, purple_background, simple_background |
| 6 | 5 |  |  |  |  |  | 1girl, alternate_costume, collared_shirt, long_sleeves, simple_background, solo, white_background, white_shirt, black_vest, closed_mouth, looking_at_viewer, medium_hair, ponytail, upper_body, blush, red_necktie, sidelocks |
| 7 | 6 |  |  |  |  |  | 1girl, alternate_costume, black_pants, closed_mouth, collared_shirt, long_sleeves, simple_background, solo, white_background, white_shirt, black_footwear, black_vest, full_body, looking_at_viewer, shoes, holding, jacket, medium_hair, red_necktie |
| 8 | 5 |  |  |  |  |  | 2girls, collared_shirt, white_shirt, alternate_costume, black_vest, long_sleeves, red_dress, black_pants, long_hair, looking_at_viewer, red_necktie, blush, closed_mouth, fishnet_pantyhose, open_mouth, ponytail, princess_carry, simple_background, solo_focus |
| 9 | 63 |  |  |  |  |  | 1girl, floral_print, official_alternate_costume, hair_flower, long_sleeves, purple_kimono, obi, purple_hairband, wide_sleeves, solo, looking_at_viewer, frilled_kimono, blush, simple_background, white_background, closed_mouth, medium_hair, white_socks, holding, smile, sandals |
| 10 | 9 |  |  |  |  |  | 1boy, blush, hetero, solo_focus, 1girl, penis, erection, looking_at_viewer, pov, completely_nude, handjob, mosaic_censoring, nipples, open_mouth, small_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | alternate_costume | fake_animal_ears | looking_at_viewer | playboy_bunny | rabbit_ears | solo | strapless_leotard | bare_shoulders | blush | fake_tail | rabbit_tail | black_leotard | full_body | simple_background | small_breasts | white_background | ass | bare_legs | feet | from_behind | looking_back | medium_hair | pantyhose | toes | closed_mouth | detached_collar | wrist_cuffs | belt | gun | smile | long_hair | hair_bow | purple_dress | black_dress | earrings | holding | alternate_hairstyle | cleavage | black_skirt | garrison_cap | juliet_sleeves | shotgun | pleated_skirt | holding_gun | black_jacket | black_headwear | boots | miniskirt | open_mouth | red_shirt | black_footwear | collared_shirt | purple_jacket | black_belt | upper_body | purple_background | long_sleeves | white_shirt | black_vest | ponytail | red_necktie | sidelocks | black_pants | shoes | jacket | 2girls | red_dress | fishnet_pantyhose | princess_carry | solo_focus | floral_print | official_alternate_costume | hair_flower | purple_kimono | obi | purple_hairband | wide_sleeves | frilled_kimono | white_socks | sandals | 1boy | hetero | penis | erection | pov | completely_nude | handjob | mosaic_censoring | nipples |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:-------------------|:--------------------|:----------------|:--------------|:-------|:--------------------|:-----------------|:--------|:------------|:--------------|:----------------|:------------|:--------------------|:----------------|:-------------------|:------|:------------|:-------|:--------------|:---------------|:--------------|:------------|:-------|:---------------|:------------------|:--------------|:-------|:------|:--------|:------------|:-----------|:---------------|:--------------|:-----------|:----------|:----------------------|:-----------|:--------------|:---------------|:-----------------|:----------|:----------------|:--------------|:---------------|:-----------------|:--------|:------------|:-------------|:------------|:-----------------|:-----------------|:----------------|:-------------|:-------------|:--------------------|:---------------|:--------------|:-------------|:-----------|:--------------|:------------|:--------------|:--------|:---------|:---------|:------------|:--------------------|:-----------------|:-------------|:---------------|:-----------------------------|:--------------|:----------------|:------|:------------------|:---------------|:-----------------|:--------------|:----------|:-------|:---------|:--------|:-----------|:------|:------------------|:----------|:-------------------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | X | | X | X | X | | | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 42 |  |  |  |  |  | X | X | | X | | | X | | X | X | | | | | X | | X | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | X | | | X | | | X | | | | X | X | | X | | | | | | X | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | | | X | | | X | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | X | | X | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | | X | | | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | | X | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | X | | | X | | | X | | | | | X | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | X | | | X | | | | | | | X | X | | X | | | | | | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | | | | | X | X | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | | X | | X | | | | | | X | | | | | X | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 9 | 63 |  |  |  |  |  | X | | | X | | | X | | | X | | | | | X | | X | | | | | | X | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 10 | 9 |  |  |  |  |  | X | | | X | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
nblinh63/twitter_dataset_1712686306 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 80428
num_examples: 206
download_size: 37435
dataset_size: 80428
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ntphu/datademo | ---
license: mit
---
|
augmxnt/shisa-pretrain-en-ja-v1 | ---
license: odc-by
language:
- ja
- en
---
This pre-training dataset was created for [shisa-base-7b-v1](https://huggingface.co/augmxnt/shisa-base-7b-v1).
It is primarily composed of a DSIR sampling of [MADLAD-400](https://huggingface.co/datasets/allenai/MADLAD-400) JA/EN tokens in a 90%/10% ratio. |
ibranze/araproje_mmlu_tr_f2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 0
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_f2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tac_50_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tac_50/TAC-50/TAC-50 (Girls' Frontline)
This is the dataset of tac_50/TAC-50/TAC-50 (Girls' Frontline), containing 26 images and their tags.
The core tags of this character are `hair_ornament, green_eyes, green_hair, long_hair, breasts, hairclip, heterochromia, bangs, hair_over_one_eye, pointy_ears, yellow_eyes`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 35.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tac_50_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 20.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tac_50_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 56 | 39.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tac_50_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 31.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tac_50_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 56 | 56.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tac_50_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tac_50_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_sclera, solo, black_gloves, collarbone, fingerless_gloves, black_jacket, black_pantyhose, closed_mouth, hood, blush, braid, cleavage, hair_between_eyes, holding_gun, knee_pads, looking_at_viewer, open_clothes, sniper_rifle |
| 1 | 5 |  |  |  |  |  | 1girl, black_pantyhose, black_sclera, looking_at_viewer, skirt, solo, belt, black_footwear, blush, closed_mouth, elf, full_body, sidelocks, torn_pantyhose, arm_wrap, bandaged_arm, corset, gloves, high_heel_boots, holding_gun, hood_down, hooded_cloak, knife, official_alternate_costume, panties_under_pantyhose, sniper_rifle, thighband_pantyhose, ass, black_cape, black_nails, crossbow, from_side, halloween_costume, hooded_cape, medium_breasts, simple_background, strapless, thigh_strap, torn_cape, transparent_background, underbust, very_long_hair, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_sclera | solo | black_gloves | collarbone | fingerless_gloves | black_jacket | black_pantyhose | closed_mouth | hood | blush | braid | cleavage | hair_between_eyes | holding_gun | knee_pads | looking_at_viewer | open_clothes | sniper_rifle | skirt | belt | black_footwear | elf | full_body | sidelocks | torn_pantyhose | arm_wrap | bandaged_arm | corset | gloves | high_heel_boots | hood_down | hooded_cloak | knife | official_alternate_costume | panties_under_pantyhose | thighband_pantyhose | ass | black_cape | black_nails | crossbow | from_side | halloween_costume | hooded_cape | medium_breasts | simple_background | strapless | thigh_strap | torn_cape | transparent_background | underbust | very_long_hair | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------------|:-------------|:--------------------|:---------------|:------------------|:---------------|:-------|:--------|:--------|:-----------|:--------------------|:--------------|:------------|:--------------------|:---------------|:---------------|:--------|:-------|:-----------------|:------|:------------|:------------|:-----------------|:-----------|:---------------|:---------|:---------|:------------------|:------------|:---------------|:--------|:-----------------------------|:--------------------------|:----------------------|:------|:-------------|:--------------|:-----------|:------------|:--------------------|:--------------|:-----------------|:--------------------|:------------|:--------------|:------------|:-------------------------|:------------|:-----------------|:-------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | | | | X | X | | X | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
venkycs/llm4security | ---
license: apache-2.0
task_categories:
- summarization
- question-answering
- text2text-generation
language:
- en
tags:
- security
- cyber
- infosec
pretty_name: Instruction dataset for CyberSecurity
size_categories:
- 10K<n<100K
--- |
WJYBUPT/law_item | ---
license: apache-2.0
---
|
yuanzheng625/auto-retrain-input-dataset | ---
license: apache-2.0
task_categories:
- image-classification
language:
- en
pretty_name: tiny_demo1
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tr416/dataset_20231006_235017 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73855
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_235017"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PommesPeter/imelodist-increment | ---
language:
- en
license: apache-2.0
size_categories:
- 100M<n<1B
task_categories:
- text-generation
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 13588597055
num_examples: 5188802
download_size: 7800945420
dataset_size: 13588597055
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- music
---
# Imelodist Incremental Knowledge |
realslimman/REFUGE-MultiRater | ---
license: apache-2.0
task_categories:
- image-segmentation
- image-classification
language:
- en
tags:
- segmentation
- fundus image
- glaucoma
- medical image
pretty_name: REFUGE
size_categories:
- 1K<n<10K
---
## REFUGE
REFUGE Challenge provides a dataset of 1200 fundus images with ground-truth segmentations and clinical glaucoma labels, currently the largest of its kind.
This dataset supplies multi-rater annotations for the [REFUGE Challenge Dataset](https://refuge.grand-challenge.org/). The challenge dataset releases majority-vote results (with some modifications) of seven independent annotations. We release the source seven annotations here.
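The majority-vote fusion mentioned above can be illustrated with a small sketch. The toy masks and the at-least-4-of-7 threshold here are hypothetical; the challenge's released labels apply some modifications to a plain vote:

```python
# Seven hypothetical binary rater masks over a toy 2x2 pixel grid.
raters = [
    [[1, 0], [1, 1]],
    [[1, 0], [0, 1]],
    [[1, 1], [1, 1]],
    [[0, 0], [1, 1]],
    [[1, 0], [1, 0]],
    [[1, 0], [1, 1]],
    [[0, 1], [1, 1]],
]

def majority_vote(masks, threshold=4):
    """Fuse binary masks: a pixel is foreground where at least
    `threshold` raters mark it as foreground."""
    rows, cols = len(masks[0]), len(masks[0][0])
    return [
        [int(sum(m[r][c] for m in masks) >= threshold) for c in range(cols)]
        for r in range(rows)
    ]

fused = majority_vote(raters)  # → [[1, 0], [1, 1]]
```

With seven raters, a threshold of 4 is a strict majority; the source annotations released here let you choose other fusion rules (e.g. STAPLE or soft averaging) instead.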
## Cite
~~~
@article{fang2022refuge2,
title={REFUGE2 Challenge: Treasure for Multi-Domain Learning in Glaucoma Assessment},
author={Fang, Huihui and Li, Fei and Wu, Junde and Fu, Huazhu and Sun, Xu and Cao, Xingxing and Son, Jaemin and Yu, Shuang and Zhang, Menglu and Yuan, Chenglang and Bian, Cheng and others},
journal={arXiv preprint arXiv:2202.08994},
year={2022}
}
~~~ |
sunhaozhepy/ag_news_rake_keywords_embeddings | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
- name: keywords
dtype: string
- name: keywords_embeddings
sequence: float32
splits:
- name: train
num_bytes: 409214650
num_examples: 120000
- name: test
num_bytes: 25906096
num_examples: 7600
download_size: 498660839
dataset_size: 435120746
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
zzzghttt/code2test | ---
license: apache-2.0
---
# Dataset Card for [code2test]
## Dataset Description
### Dataset Summary
This dataset is designed to generate unit tests from provided Java source code.
## Dataset Creation
1. Gather all Java projects from GitHub that have more than 5 stars.
2. Extract code-test pairs from all these projects.
3. Remove duplicate data.
4. Clean the data, including the removal of any copyright material.
5. Identify and eliminate tests that contain test smells.
6. Transform the data into an instruction-based dataset.
7. Divide the dataset into training and testing segments.
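Step 3 above (duplicate removal) might look like the following hash-based sketch. This is a hypothetical illustration, not the actual pipeline:

```python
import hashlib

def dedup_pairs(pairs):
    """Drop exact duplicate (code, test) pairs, keeping the first occurrence."""
    seen, unique = set(), []
    for code, test in pairs:
        # Hash the pair with a separator so ("ab", "c") != ("a", "bc").
        key = hashlib.sha256((code + "\x00" + test).encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append((code, test))
    return unique

pairs = [("int add(){}", "void testAdd(){}")] * 3 + [("int sub(){}", "void testSub(){}")]
unique = dedup_pairs(pairs)  # → 2 unique pairs remain
```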
|
McGill-NLP/FaithDial | ---
annotations_creators:
- crowdsourced
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100k
task_categories:
- conversational
- text-generation
task_ids:
- dialogue-modeling
pretty_name: A Faithful Benchmark for Information-Seeking Dialogue
tags:
- faithful-dialogue-modeling
- trustworthy-dialogue-modeling
---
## Dataset Summary
FaithDial is a faithful knowledge-grounded dialogue benchmark, composed of **50,761** turns spanning **5649** conversations. It was curated through Amazon Mechanical Turk by asking annotators to amend hallucinated utterances in [Wizard of Wikipedia](https://parl.ai/projects/wizard_of_wikipedia/) (WoW). In our dialogue setting, we simulate interactions between two speakers: **an information seeker** and **a bot wizard**. The seeker has a large degree of freedom as opposed to the wizard bot which is more restricted on what it can communicate. In fact, it must abide by the following rules:
- **First**, it should be truthful by providing information that is attributable to the source knowledge *K*.
- **Second**, it should provide information conversationally, i.e., use naturalistic phrasing of *K*, support follow-on discussion with questions, and prompt user's opinions.
- **Third**, it should acknowledge its ignorance of the answer in those cases where *K* does not include it while still moving the conversation forward using *K*.
## Dataset Description
- **Homepage:** [FaithDial](https://mcgill-nlp.github.io/FaithDial/)
- **Repository:** [GitHub](https://github.com/McGill-NLP/FaithDial)
- **Point of Contact:** [Nouha Dziri](mailto:dziri@ualberta.ca)
## Language
English
## Data Instance
An example of 'train' looks as follows:
```text
[
{
"utterances": [
... // prior utterances,
{
"history": [
"Have you ever been to a concert? They're so fun!",
"No I cannot as a bot. However, have you been to Madonna's? Her 10th concert was used to help her 13th album called \"Rebel Heart\".",
"Yeah I've heard of it but never went or what it was for. Can you tell me more about it?"
],
"speaker": "Wizard",
"knowledge": "It began on September 9, 2015, in Montreal, Canada, at the Bell Centre and concluded on March 20, 2016, in Sydney, Australia at Allphones Arena.",
"original_response": "It started in September of 2015 and ran all the way through March of 2016. Can you imagine being on the road that long?",
"response": "Sure. The concert started in September 9th of 2015 at Montreal, Canada. It continued till 20th of March of 2016, where it ended at Sydney, Australia.",
"BEGIN": [
"Hallucination",
"Entailment"
],
"VRM": [
"Disclosure",
"Question"
]
},
... // more utterances
]
},
... // more dialogues
]
```
If `original_response` is empty, the response is faithful to the source and we consider it a FaithDial response. Faithful WoW responses are also edited slightly if they contain grammatical issues or typos.
## Data Fields
- `history`: `List[string]`. The dialogue history.
- `knowledge`: `string`. The source knowledge on which the bot wizard should ground its response.
- `speaker`: `string`. The current speaker.
- `original_response`: `string`. The original WoW response, before editing.
- `response`: `string`. The new Wizard response.
- `BEGIN`: `List[string]`. The BEGIN labels for the Wizard response.
- `VRM`: `List[string]`. The VRM labels for the wizard response.
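A minimal sketch of navigating these fields, using a turn hand-copied (and truncated) from the example instance shown earlier:

```python
# One FaithDial turn, abbreviated from the example above.
turn = {
    "history": [
        "Have you ever been to a concert? They're so fun!",
        "No I cannot as a bot. However, have you been to Madonna's? ...",
        "Yeah I've heard of it but never went or what it was for. Can you tell me more about it?",
    ],
    "speaker": "Wizard",
    "knowledge": "It began on September 9, 2015, in Montreal, Canada, ...",
    "original_response": "It started in September of 2015 and ran all the way through March of 2016. ...",
    "response": "Sure. The concert started in September 9th of 2015 at Montreal, Canada. ...",
    "BEGIN": ["Hallucination", "Entailment"],
    "VRM": ["Disclosure", "Question"],
}

# A turn was edited iff `original_response` is non-empty.
was_edited = bool(turn["original_response"])
last_user_utterance = turn["history"][-1]
```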
## Data Splits
- `Train`: 36809 turns
- `Valid`: 6851 turns
- `Test`: 7101 turns
`Valid` includes both the `seen` and the `unseen` data splits from WoW. The same applies to `Test`. We also include those splits for FaithDial valid and test data.
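As a quick sanity check, the three split sizes sum to the 50,761 total turns quoted in the summary:

```python
# Turn counts per split, as listed above.
splits = {"train": 36809, "valid": 6851, "test": 7101}
total = sum(splits.values())
assert total == 50761  # matches the 50,761 turns in the dataset summary
```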
## Annotations
Following the guidelines for ethical crowdsourcing outlined in [Sheehan, 2018](https://www.tandfonline.com/doi/abs/10.1080/03637751.2017.1342043),
we hired Amazon Mechanical Turk (AMT) workers to edit utterances in WoW dialogues that were found to exhibit unfaithful responses. To ensure clarity in the task definition, we provided detailed examples of our terminology. We also performed several staging rounds over the course of several months.
### Who are the annotators?
To be eligible for the task, workers had to be located in the United States or Canada and to successfully answer 20 questions in a qualification test. Before launching the main annotation task, we ran a small pilot round (60 HITs) to check worker performance. We emailed workers who made errors, providing them with examples of how to fix their mistakes in future HITs.
## Personal and Sensitive Information
Seeker utterances in FaithDial may contain personal and sensitive information.
## Social Impact of Dataset
In recent years, the conversational AI market has seen a proliferation of applications, powered by large pre-trained LMs, spanning a broad range of domains such as customer support, education, e-commerce, health, and entertainment. Ensuring that these systems are trustworthy is key to deploying them safely at large scale in real-world applications, especially in high-stakes domains. FaithDial holds promise to encourage faithfulness in information-seeking dialogue and to make virtual assistants both safer and more reliable.
## Licensing Information
MIT
## Citation Information
```bibtex
@article{dziri2022faithdial,
title={FaithDial: A Faithful Benchmark for Information-Seeking Dialogue},
author={Dziri, Nouha and Kamalloo, Ehsan and Milton, Sivan and Zaiane, Osmar and Yu, Mo and Ponti, Edoardo and Reddy, Siva},
journal={arXiv preprint, arXiv:2204.10757},
year={2022},
url={https://arxiv.org/abs/2204.10757}
}
```
|
antares101/gdsfactory_stack | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: string
- name: metadata
struct:
- name: file_path
dtype: string
- name: repo_id
dtype: string
- name: token_count
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 25012525
num_examples: 3421
download_size: 12167752
dataset_size: 25012525
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CC1984/mall_receipt_extraction_dataset | ---
license: mit
---
|
open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-14b | ---
pretty_name: Evaluation run of Azure99/blossom-v4-qwen1_5-14b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v4-qwen1_5-14b](https://huggingface.co/Azure99/blossom-v4-qwen1_5-14b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-14b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T16:10:54.536320](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-14b/blob/main/results_2024-02-19T16-10-54.536320.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6761781501776473,\n\
\ \"acc_stderr\": 0.03171302441468721,\n \"acc_norm\": 0.6794416041533885,\n\
\ \"acc_norm_stderr\": 0.03234055287965887,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5521420727155698,\n\
\ \"mc2_stderr\": 0.015524240307962742\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5315699658703071,\n \"acc_stderr\": 0.014582236460866977,\n\
\ \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5987851025692094,\n\
\ \"acc_stderr\": 0.004891426533390628,\n \"acc_norm\": 0.7984465245966939,\n\
\ \"acc_norm_stderr\": 0.004003405481372166\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145631,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145631\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708052,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n\
\ \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5582010582010583,\n \"acc_stderr\": 0.025576257061253833,\n \"\
acc_norm\": 0.5582010582010583,\n \"acc_norm_stderr\": 0.025576257061253833\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329283,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.02753019635506658,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.02753019635506658\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4074074074074074,\n \"acc_stderr\": 0.029958249250082107,\n \
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.029958249250082107\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944216,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944216\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654383,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654383\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46145251396648046,\n\
\ \"acc_stderr\": 0.016672731267552258,\n \"acc_norm\": 0.46145251396648046,\n\
\ \"acc_norm_stderr\": 0.016672731267552258\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729494,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729494\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.02623696588115327,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.02623696588115327\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045699,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045699\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6895424836601307,\n \"acc_stderr\": 0.01871806705262323,\n \
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.01871806705262323\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5521420727155698,\n\
\ \"mc2_stderr\": 0.015524240307962742\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658464\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6648976497346475,\n \
\ \"acc_stderr\": 0.013001948176422948\n }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v4-qwen1_5-14b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|arc:challenge|25_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|gsm8k|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hellaswag|10_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-10-54.536320.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T16-10-54.536320.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- '**/details_harness|winogrande|5_2024-02-19T16-10-54.536320.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T16-10-54.536320.parquet'
- config_name: results
data_files:
- split: 2024_02_19T16_10_54.536320
path:
- results_2024-02-19T16-10-54.536320.parquet
- split: latest
path:
- results_2024-02-19T16-10-54.536320.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v4-qwen1_5-14b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azure99/blossom-v4-qwen1_5-14b](https://huggingface.co/Azure99/blossom-v4-qwen1_5-14b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-14b",
"harness_winogrande_5",
split="train")
```
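The per-task configuration names follow a regular pattern (as seen in the config list above), so they can also be built programmatically. A minimal sketch, assuming the naming convention shown in this repository's YAML:

```python
# Sketch: build a harness config name for a given MMLU (hendrycksTest) subtask,
# matching the naming pattern of the configs listed above. The pattern is
# inferred from this card's YAML, not from an official API.
def mmlu_config_name(subtask: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{subtask}_{n_shot}"

print(mmlu_config_name("world_religions"))
# harness_hendrycksTest_world_religions_5
```

The resulting string can be passed as the second argument to `load_dataset`, in place of `"harness_winogrande_5"` in the example above.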
## Latest results
These are the [latest results from run 2024-02-19T16:10:54.536320](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-14b/blob/main/results_2024-02-19T16-10-54.536320.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6761781501776473,
"acc_stderr": 0.03171302441468721,
"acc_norm": 0.6794416041533885,
"acc_norm_stderr": 0.03234055287965887,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5521420727155698,
"mc2_stderr": 0.015524240307962742
},
"harness|arc:challenge|25": {
"acc": 0.5315699658703071,
"acc_stderr": 0.014582236460866977,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.5987851025692094,
"acc_stderr": 0.004891426533390628,
"acc_norm": 0.7984465245966939,
"acc_norm_stderr": 0.004003405481372166
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145631,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145631
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708052,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5582010582010583,
"acc_stderr": 0.025576257061253833,
"acc_norm": 0.5582010582010583,
"acc_norm_stderr": 0.025576257061253833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329283,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.02753019635506658,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.02753019635506658
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.029958249250082107,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.029958249250082107
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944216,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944216
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654383,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654383
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46145251396648046,
"acc_stderr": 0.016672731267552258,
"acc_norm": 0.46145251396648046,
"acc_norm_stderr": 0.016672731267552258
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729494,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729494
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115327,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115327
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045699,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045699
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.01871806705262323,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.01871806705262323
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5521420727155698,
"mc2_stderr": 0.015524240307962742
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658464
},
"harness|gsm8k|5": {
"acc": 0.6648976497346475,
"acc_stderr": 0.013001948176422948
}
}
```
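The top-level `"all"` block aggregates the per-task scores. As a rough sanity check only (the leaderboard's own aggregation may weight or group tasks differently), you can average a few of the per-task accuracies copied from the JSON above:

```python
# Unweighted mean over three hendrycksTest subtask accuracies copied from the
# results JSON above. This is an illustration, not the leaderboard's exact
# aggregation logic.
subtask_acc = {
    "abstract_algebra": 0.38,
    "anatomy": 0.6222222222222222,
    "astronomy": 0.756578947368421,
}
mean_acc = sum(subtask_acc.values()) / len(subtask_acc)
print(round(mean_acc, 4))
# 0.5863
```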
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
introspector/llama.cpp-0002 | ---
license: mit
---
|
ZurabDz/tokenized_large_corpus_v2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 6701093568
num_examples: 14442012
download_size: 2431678404
dataset_size: 6701093568
---
# Dataset Card for "tokenized_large_corpus_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
suolyer/pile_hackernews | ---
license: apache-2.0
---
|
atmansingh/medalpaca | ---
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: input_ids
struct:
- name: attention_mask
sequence: int64
- name: input_ids
sequence: int64
- name: labels
struct:
- name: attention_mask
sequence: int64
- name: input_ids
sequence: int64
splits:
- name: train
num_bytes: 26299548
num_examples: 2208
download_size: 5311532
dataset_size: 26299548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dmayhem93/self-critiquing-critique-and-refine | ---
dataset_info:
features:
- name: id
dtype: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: time
dtype: float64
- name: labeler
dtype: string
- name: is_topic_based_summarization
dtype: bool
- name: category
dtype: string
- name: severity
dtype: int64
- name: text_quotes
list:
- name: begin
dtype: int64
- name: end
dtype: int64
- name: response_quotes
list:
- name: begin
dtype: int64
- name: end
dtype: int64
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 170238231
num_examples: 34069
- name: test
num_bytes: 26100872
num_examples: 5119
download_size: 27410564
dataset_size: 196339103
---
# Dataset Card for "self-critiquing-critique-and-refine"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Evaloric__Evaloric-1.1B-test | ---
pretty_name: Evaluation run of Evaloric/Evaloric-1.1B-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Evaloric/Evaloric-1.1B-test](https://huggingface.co/Evaloric/Evaloric-1.1B-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Evaloric__Evaloric-1.1B-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T18:28:31.760304](https://huggingface.co/datasets/open-llm-leaderboard/details_Evaloric__Evaloric-1.1B-test/blob/main/results_2024-04-02T18-28-31.760304.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.269686383309535,\n\
\ \"acc_stderr\": 0.03115064860656723,\n \"acc_norm\": 0.2688981642176216,\n\
\ \"acc_norm_stderr\": 0.03188600074775313,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570349,\n \"mc2\": 0.382771034671839,\n\
\ \"mc2_stderr\": 0.014837380256199574\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3515358361774744,\n \"acc_stderr\": 0.013952413699600938,\n\
\ \"acc_norm\": 0.3660409556313993,\n \"acc_norm_stderr\": 0.014077223108470137\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46395140410276836,\n\
\ \"acc_stderr\": 0.004976796060456436,\n \"acc_norm\": 0.6097390957976498,\n\
\ \"acc_norm_stderr\": 0.004868117598481947\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741727,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741727\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234113,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234113\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031726,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.22580645161290322,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.22580645161290322,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2019704433497537,\n \"acc_stderr\": 0.028247350122180277,\n \"\
acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.028247350122180277\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.030748905363909892,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.030748905363909892\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.02221110681006166,\n \
\ \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.02221110681006166\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000683,\n \
\ \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000683\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861493,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861493\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.30392156862745096,\n \"acc_stderr\": 0.03228210387037892,\n \"\
acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.03228210387037892\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.32286995515695066,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969174,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969174\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32051282051282054,\n\
\ \"acc_stderr\": 0.03057281131029961,\n \"acc_norm\": 0.32051282051282054,\n\
\ \"acc_norm_stderr\": 0.03057281131029961\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29757343550446996,\n\
\ \"acc_stderr\": 0.01634911191290944,\n \"acc_norm\": 0.29757343550446996,\n\
\ \"acc_norm_stderr\": 0.01634911191290944\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113596,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113596\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.010946570966348787,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.010946570966348787\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714854,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714854\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142766,\n \
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.02520696315422539,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.02520696315422539\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.22388059701492538,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.036310534964889056,\n\
\ \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.036310534964889056\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570349,\n \"mc2\": 0.382771034671839,\n\
\ \"mc2_stderr\": 0.014837380256199574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6961325966850829,\n \"acc_stderr\": 0.012926209475483591\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.050037907505686124,\n \
\ \"acc_stderr\": 0.00600544235457773\n }\n}\n```"
repo_url: https://huggingface.co/Evaloric/Evaloric-1.1B-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-31.760304.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-28-31.760304.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- '**/details_harness|winogrande|5_2024-04-02T18-28-31.760304.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T18-28-31.760304.parquet'
- config_name: results
data_files:
- split: 2024_04_02T18_28_31.760304
path:
- results_2024-04-02T18-28-31.760304.parquet
- split: latest
path:
- results_2024-04-02T18-28-31.760304.parquet
---
# Dataset Card for Evaluation run of Evaloric/Evaloric-1.1B-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Evaloric/Evaloric-1.1B-test](https://huggingface.co/Evaloric/Evaloric-1.1B-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Evaloric__Evaloric-1.1B-test",
"harness_winogrande_5",
	split="latest")
```
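Each timestamped split name is simply the run timestamp with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). A small helper for mapping a run timestamp to its split name (an illustrative sketch only, not part of the official leaderboard tooling):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (e.g. '2024-04-02T18:28:31.760304')
    to its split name (e.g. '2024_04_02T18_28_31.760304')."""
    # Split names replace '-' and ':' with '_' but keep the '.'
    # that precedes the microseconds.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-04-02T18:28:31.760304"))
```

The returned string can be passed as the `split` argument of `load_dataset` to pin a specific run instead of `"latest"`.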
## Latest results
These are the [latest results from run 2024-04-02T18:28:31.760304](https://huggingface.co/datasets/open-llm-leaderboard/details_Evaloric__Evaloric-1.1B-test/blob/main/results_2024-04-02T18-28-31.760304.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval configuration):
```python
{
"all": {
"acc": 0.269686383309535,
"acc_stderr": 0.03115064860656723,
"acc_norm": 0.2688981642176216,
"acc_norm_stderr": 0.03188600074775313,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570349,
"mc2": 0.382771034671839,
"mc2_stderr": 0.014837380256199574
},
"harness|arc:challenge|25": {
"acc": 0.3515358361774744,
"acc_stderr": 0.013952413699600938,
"acc_norm": 0.3660409556313993,
"acc_norm_stderr": 0.014077223108470137
},
"harness|hellaswag|10": {
"acc": 0.46395140410276836,
"acc_stderr": 0.004976796060456436,
"acc_norm": 0.6097390957976498,
"acc_norm_stderr": 0.004868117598481947
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741727,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741727
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234113,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031726,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.14,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180277,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180277
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.030748905363909892,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.030748905363909892
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.02221110681006166,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.02221110681006166
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000683,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000683
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861493,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861493
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969174,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969174
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.03057281131029961,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.03057281131029961
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29757343550446996,
"acc_stderr": 0.01634911191290944,
"acc_norm": 0.29757343550446996,
"acc_norm_stderr": 0.01634911191290944
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113596,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113596
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348787,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348787
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714854,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714854
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142766,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.02520696315422539,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.02520696315422539
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.036310534964889056,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.036310534964889056
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570349,
"mc2": 0.382771034671839,
"mc2_stderr": 0.014837380256199574
},
"harness|winogrande|5": {
"acc": 0.6961325966850829,
"acc_stderr": 0.012926209475483591
},
"harness|gsm8k|5": {
"acc": 0.050037907505686124,
"acc_stderr": 0.00600544235457773
}
}
```
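The per-task entries above share a common shape (`"acc"`, `"acc_stderr"`, ...), so simple aggregates can be recomputed locally from the JSON. A minimal sketch (the function name is ours; the leaderboard computes its own aggregates) that macro-averages `"acc"` over the per-task entries, skipping the precomputed `"all"` block:

```python
def macro_average_acc(results: dict) -> float:
    """Macro-average the "acc" field over per-task entries,
    ignoring the aggregated "all" entry and tasks without "acc"."""
    accs = [v["acc"] for name, v in results.items()
            if name != "all" and "acc" in v]
    return sum(accs) / len(accs)

# Toy example with the same structure as the JSON above:
sample = {
    "all": {"acc": 0.3},
    "harness|winogrande|5": {"acc": 0.6961325966850829},
    "harness|gsm8k|5": {"acc": 0.050037907505686124},
}
print(macro_average_acc(sample))  # mean of the two task accuracies
```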
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SEACrowd/id_am2ico | ---
tags:
- concept-alignment-classification
language:
- ind
- eng
---
# id_am2ico
In this work, we present AM2iCo, a wide-coverage and carefully designed cross-lingual and multilingual evaluation set;
it aims to assess the ability of state-of-the-art representation models to reason over cross-lingual
lexical-level concept alignment in context for 14 language pairs.
This dataset contains only the Indonesian-English language pair.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{liu-etal-2021-am2ico,
title = "{AM}2i{C}o: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples",
author = "Liu, Qianchu and
Ponti, Edoardo Maria and
McCarthy, Diana and
  Vuli{\'c}, Ivan  and
Korhonen, Anna",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.571",
doi = "10.18653/v1/2021.emnlp-main.571",
pages = "7151--7162",
abstract = "Capturing word meaning in context and distinguishing between correspondences and variations across languages is key to building successful multilingual and cross-lingual text representation models. However, existing multilingual evaluation datasets that evaluate lexical semantics {``}in-context{''} have various limitations. In particular, 1) their language coverage is restricted to high-resource languages and skewed in favor of only a few language families and areas, 2) a design that makes the task solvable via superficial cues, which results in artificially inflated (and sometimes super-human) performances of pretrained encoders, and 3) no support for cross-lingual evaluation. In order to address these gaps, we present AM2iCo (Adversarial and Multilingual Meaning in Context), a wide-coverage cross-lingual and multilingual evaluation set; it aims to faithfully assess the ability of state-of-the-art (SotA) representation models to understand the identity of word meaning in cross-lingual contexts for 14 language pairs. We conduct a series of experiments in a wide range of setups and demonstrate the challenging nature of AM2iCo. The results reveal that current SotA pretrained encoders substantially lag behind human performance, and the largest gaps are observed for low-resource languages and languages dissimilar to English.",
}
```
## License
CC-BY 4.0
## Homepage
[https://github.com/cambridgeltl/AM2iCo](https://github.com/cambridgeltl/AM2iCo)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
andersonbcdefg/llm-topics | ---
dataset_info:
features:
- name: query
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 64200310
num_examples: 99978
download_size: 40311342
dataset_size: 64200310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Corianas__Quokka_1.3b | ---
pretty_name: Evaluation run of Corianas/Quokka_1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Corianas/Quokka_1.3b](https://huggingface.co/Corianas/Quokka_1.3b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__Quokka_1.3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T07:19:08.613938](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_1.3b/blob/main/results_2023-09-23T07-19-08.613938.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.0004445109990558716,\n \"f1\": 0.04535549496644304,\n\
\ \"f1_stderr\": 0.00121193350790111,\n \"acc\": 0.26361483820047354,\n\
\ \"acc_stderr\": 0.007015815814913848\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.0004445109990558716,\n\
\ \"f1\": 0.04535549496644304,\n \"f1_stderr\": 0.00121193350790111\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5272296764009471,\n\
\ \"acc_stderr\": 0.014031631629827696\n }\n}\n```"
repo_url: https://huggingface.co/Corianas/Quokka_1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T07_19_08.613938
path:
- '**/details_harness|drop|3_2023-09-23T07-19-08.613938.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T07-19-08.613938.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T07_19_08.613938
path:
- '**/details_harness|gsm8k|5_2023-09-23T07-19-08.613938.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T07-19-08.613938.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:59:51.596909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:59:51.596909.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T07_19_08.613938
path:
- '**/details_harness|winogrande|5_2023-09-23T07-19-08.613938.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T07-19-08.613938.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_59_51.596909
path:
- results_2023-07-19T14:59:51.596909.parquet
- split: 2023_09_23T07_19_08.613938
path:
- results_2023-09-23T07-19-08.613938.parquet
- split: latest
path:
- results_2023-09-23T07-19-08.613938.parquet
---
# Dataset Card for Evaluation run of Corianas/Quokka_1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/Quokka_1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/Quokka_1.3b](https://huggingface.co/Corianas/Quokka_1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_1.3b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T07:19:08.613938](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_1.3b/blob/main/results_2023-09-23T07-19-08.613938.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558716,
"f1": 0.04535549496644304,
"f1_stderr": 0.00121193350790111,
"acc": 0.26361483820047354,
"acc_stderr": 0.007015815814913848
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558716,
"f1": 0.04535549496644304,
"f1_stderr": 0.00121193350790111
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5272296764009471,
"acc_stderr": 0.014031631629827696
}
}
```
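For quick inspection without re-downloading the dataset, the per-task metrics above can be handled as a plain Python dict. This is only an illustrative sketch: the values are copied from the JSON block above, and the helper names (`acc_by_task`, `best_task`) are not part of the leaderboard tooling.

```python
# Values copied verbatim from the "Latest results" JSON above.
results = {
    "harness|drop|3": {"em": 0.0018875838926174498, "f1": 0.04535549496644304},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5272296764009471},
}

# Keep only the accuracy-style entries and find the best-scoring task.
acc_by_task = {name: m["acc"] for name, m in results.items() if "acc" in m}
best_task = max(acc_by_task, key=acc_by_task.get)
print(best_task, acc_by_task[best_task])
```

The same pattern applies to the full results file once it is loaded (e.g. via the "results" config and the "latest" split shown in the YAML above).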
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/aoyama_midori_istheorderarabbit | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Aoyama Midori
This is the dataset of Aoyama Midori, containing 195 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 195 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 457 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 530 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 195 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 195 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 195 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 457 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 457 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 393 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 530 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 530 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
WillHeld/us_accent_cv | ---
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: sentence
dtype: string
- name: up_votes
dtype: float64
- name: down_votes
dtype: float64
- name: age
dtype: string
- name: gender
dtype: string
- name: accents
dtype: string
- name: variant
dtype: 'null'
- name: locale
dtype: string
- name: segment
dtype: 'null'
- name: label
dtype: int64
- name: embed
sequence: float64
- name: pred
dtype: float64
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 8710636787.552
num_examples: 49066
download_size: 8027142130
dataset_size: 8710636787.552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Italian_Children_Spontaneous_Speech_Speech_Data | ---
task_categories:
- automatic-speech-recognition
language:
- it
---
# Dataset Card for Nexdata/Italian_Children_Spontaneous_Speech_Speech_Data
## Description
The 101 Hours - Italian Children's Spontaneous Speech Data, manually screened and processed. Annotations contain transcription text, speaker identification, gender, and other information. This dataset can be applied to speech recognition (acoustic model or language model training), caption generation, voice content moderation, and other AI algorithm research.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1300?source=Huggingface
# Specifications
## Format
16 kHz, 16-bit, WAV, mono channel.
## Age
Children aged 12 and younger.
## Content category
Self-media, conversation, livestream, lecture, variety show.
## Language
Italian
## Annotation
Transcription text, speaker identification, and gender.
## Accuracy
Word Accuracy Rate (WAR) at least 98%.
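The audio format above (16 kHz, 16-bit, mono WAV) can be verified programmatically; a minimal sketch using Python's standard `wave` module (the file name is hypothetical):

```python
import wave

def matches_spec(path):
    """Return True if the file is a 16 kHz, 16-bit, mono WAV."""
    with wave.open(path, "rb") as w:
        return (w.getframerate() == 16000
                and w.getsampwidth() == 2      # 2 bytes = 16-bit samples
                and w.getnchannels() == 1)     # mono

# Create a one-second silent clip matching the spec, then verify it.
with wave.open("sample.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(16000)
    w.writeframes(b"\x00\x00" * 16000)

print(matches_spec("sample.wav"))  # → True
```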
# Licensing Information
Commercial License |
CVasNLPExperiments/OxfordPets_test_google_flan_t5_xxl_mode_T_A_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1752684
num_examples: 3669
download_size: 287000
dataset_size: 1752684
---
# Dataset Card for "OxfordPets_test_google_flan_t5_xxl_mode_T_A_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/fuso_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fuso/扶桑/扶桑 (Azur Lane)
This is the dataset of fuso/扶桑/扶桑 (Azur Lane), containing 148 images and their tags.
The core tags of this character are `animal_ears, breasts, black_hair, long_hair, blue_eyes, large_breasts, cat_ears, animal_ear_fluff, hair_ornament, butterfly_hair_ornament, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 148 | 206.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuso_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 148 | 122.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuso_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 355 | 253.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuso_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 148 | 185.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuso_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 355 | 355.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuso_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fuso_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_kimono, cleavage, looking_at_viewer, solo, simple_background, smile, white_background, upper_body, off_shoulder, bare_shoulders, blush, butterfly |
| 1 | 10 |  |  |  |  |  | 1girl, brown_sweater, cleavage, collarbone, looking_at_viewer, solo, off-shoulder_sweater, blush, long_sleeves, smile, bare_shoulders, closed_mouth, aran_sweater, simple_background, swept_bangs, white_background, shiny, sitting, sweater_dress |
| 2 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, china_dress, cleavage, smile, hair_flower, oil-paper_umbrella, white_dress, open_mouth, holding_umbrella, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_kimono | cleavage | looking_at_viewer | solo | simple_background | smile | white_background | upper_body | off_shoulder | bare_shoulders | blush | butterfly | brown_sweater | collarbone | off-shoulder_sweater | long_sleeves | closed_mouth | aran_sweater | swept_bangs | shiny | sitting | sweater_dress | china_dress | hair_flower | oil-paper_umbrella | white_dress | open_mouth | holding_umbrella |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:--------------------|:-------|:--------------------|:--------|:-------------------|:-------------|:---------------|:-----------------|:--------|:------------|:----------------|:-------------|:-----------------------|:---------------|:---------------|:---------------|:--------------|:--------|:----------|:----------------|:--------------|:--------------|:---------------------|:--------------|:-------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | X | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | X | X | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
ahdsoft/multiple_choice_without_context | ---
license: mit
---
|
BroUP/No | ---
license: mit
---
|
luci/questions | ---
license: wtfpl
language:
- fr
---
### General Overview:
This dataset is a collection of questions and answers in French, mainly focused on technical topics such as development, DevOps, security, data, machine learning, and many other technology-related fields.
### Dataset Structure:
Each item in the dataset is an object with the following fields:
- `id`: A unique identifier for each entry.
- `category`: The category or domain of the question (for example, "linux").
- `question`: The question asked.
- `answer`: The answer to the question.
### Example Entry:
```json
{
"id": "7d0b4a78-371e-4bd9-a292-c97ce162e740",
"category": "linux",
"question": "Qu'est-ce que Linux ?",
"answer": "Linux est un système d'exploitation libre et open source basé sur le noyau Linux..."
}
```
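The structure above can be exercised with a short sketch; this assumes the entries are stored as a JSON array (the second entry below is purely hypothetical):

```python
import json
from collections import defaultdict

# Two entries in the dataset's format: the first mirrors the example above,
# the second is a hypothetical illustration.
raw = '''[
  {"id": "7d0b4a78-371e-4bd9-a292-c97ce162e740",
   "category": "linux",
   "question": "Qu'est-ce que Linux ?",
   "answer": "Linux est un systeme d'exploitation libre et open source..."},
  {"id": "00000000-0000-0000-0000-000000000000",
   "category": "devops",
   "question": "Qu'est-ce que Docker ?",
   "answer": "Docker est une plateforme de conteneurisation..."}
]'''

# Group the questions by their `category` field.
by_category = defaultdict(list)
for entry in json.loads(raw):
    by_category[entry["category"]].append(entry["question"])

print(sorted(by_category))  # → ['devops', 'linux']
```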
### Usage:
This dataset can be used to:
1. **Train language models**: using the questions and answers as training data.
2. **Evaluate language models**: by comparing model-generated answers with the answers in the dataset.
3. **Clean and improve other datasets**: by using the questions as a basis for filtering and cleaning other data collections.
### Important Note:
Note that the answers in this dataset were not supervised during generation, whether by humans or machines. As a result, some answers may not be entirely accurate. Additional verification and validation may be necessary before using this data in critical applications.
### Goal:
The main goal is to obtain, over time, clean and relevant data from relevant questions in the technology domain.
### Web demo:
http://loop.brain.fr/qwe/ |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-cd8e90-16116210 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: Blaise-g/longt5_tglobal_large_sumpubmed
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/longt5_tglobal_large_sumpubmed
* Dataset: launch/gov_report
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
Azzedde/oai_log_summary | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2244465
num_examples: 229
download_size: 476122
dataset_size: 2244465
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
IDEA-CCNL/PretrainCorpusDemo | ---
license: apache-2.0
---
For demo use only.
# PretrainCorpusDemo
## Citation
If you use this resource in your work, please cite our [paper](https://arxiv.org/abs/2209.02970):
```text
@article{fengshenbang,
author = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen},
title = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
journal = {CoRR},
volume = {abs/2209.02970},
year = {2022}
}
```
You can also cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/):
```text
@misc{Fengshenbang-LM,
title={Fengshenbang-LM},
author={IDEA-CCNL},
year={2021},
howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
```
 |
distilled-one-sec-cv12-each-chunk-uniq/chunk_268 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1083021356.0
num_examples: 211033
download_size: 1108065988
dataset_size: 1083021356.0
---
# Dataset Card for "chunk_268"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Respair/Oscar_Persian_Cleaned | ---
language:
- fa
short_description:
  - This is a portion (roughly 1/6) of the original OSCAR 2301 dataset. It has been further cleaned and normalized.
---
|
susnato/PR_comments | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: pr_number
dtype: int64
- name: pr_title
dtype: string
- name: pr_description
dtype: string
- name: author
dtype: 'null'
- name: date_created
dtype: string
- name: date_merged
dtype: string
- name: filepath
dtype: string
- name: before_content
dtype: string
- name: after_content
dtype: string
- name: pr_author
dtype: string
- name: previous_commit
dtype: string
- name: pr_commit
dtype: string
- name: comment
dtype: string
- name: comment_author
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 879064359
num_examples: 12646
download_size: 117418842
dataset_size: 879064359
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-28500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 6938739142
num_examples: 1000
download_size: 1353306987
dataset_size: 6938739142
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_23-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_23-7B-slerp](https://huggingface.co/Gille/StrangeMerges_23-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T04:04:45.787844](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp/blob/main/results_2024-02-13T04-04-45.787844.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.654991806635409,\n\
\ \"acc_stderr\": 0.03201532056571857,\n \"acc_norm\": 0.654281003985643,\n\
\ \"acc_norm_stderr\": 0.03268521672767465,\n \"mc1\": 0.605875152998776,\n\
\ \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7513000721365315,\n\
\ \"mc2_stderr\": 0.014241360525807377\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n\
\ \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7181836287592113,\n\
\ \"acc_stderr\": 0.004489648865080876,\n \"acc_norm\": 0.8889663413662617,\n\
\ \"acc_norm_stderr\": 0.0031353173122281234\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\
\ \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n\
\ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365547,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365547\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.605875152998776,\n\
\ \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7513000721365315,\n\
\ \"mc2_stderr\": 0.014241360525807377\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624174\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_23-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|arc:challenge|25_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|gsm8k|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hellaswag|10_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T04-04-45.787844.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- '**/details_harness|winogrande|5_2024-02-13T04-04-45.787844.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T04-04-45.787844.parquet'
- config_name: results
data_files:
- split: 2024_02_13T04_04_45.787844
path:
- results_2024-02-13T04-04-45.787844.parquet
- split: latest
path:
- results_2024-02-13T04-04-45.787844.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_23-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_23-7B-slerp](https://huggingface.co/Gille/StrangeMerges_23-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp",
"harness_winogrande_5",
split="train")
```
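Each record in the "results" config mirrors the JSON shown under "Latest results" below: a dict keyed by task name (plus an `"all"` aggregate), with metric values such as `acc` and `acc_norm`. As a minimal sketch of navigating that structure without a network call, the dict here is a trimmed copy of the values reported in this card:

```python
# Trimmed copy of the aggregated results reported in this card; the full
# structure has one entry per evaluated task plus the "all" aggregate.
results = {
    "all": {
        "acc": 0.654991806635409,
        "acc_norm": 0.654281003985643,
    },
    "harness|arc:challenge|25": {
        "acc_norm": 0.735494880546075,
    },
}

# Pull the overall accuracy and a single task's normalized accuracy.
overall_acc = results["all"]["acc"]
arc_acc_norm = results["harness|arc:challenge|25"]["acc_norm"]
print(f"overall acc: {overall_acc:.4f}, ARC acc_norm: {arc_acc_norm:.4f}")
```

The same key names apply to the records returned by `load_dataset(..., "results", split="latest")`.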
## Latest results
These are the [latest results from run 2024-02-13T04:04:45.787844](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp/blob/main/results_2024-02-13T04-04-45.787844.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.654991806635409,
"acc_stderr": 0.03201532056571857,
"acc_norm": 0.654281003985643,
"acc_norm_stderr": 0.03268521672767465,
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7513000721365315,
"mc2_stderr": 0.014241360525807377
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.7181836287592113,
"acc_stderr": 0.004489648865080876,
"acc_norm": 0.8889663413662617,
"acc_norm_stderr": 0.0031353173122281234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365547,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7513000721365315,
"mc2_stderr": 0.014241360525807377
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624174
}
}
```
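The aggregated results above are plain JSON keyed by task name, so they can be post-processed directly, for example to average the per-subtask MMLU (`hendrycksTest`) accuracies. A minimal sketch, using a small excerpt of the scores shown above (the variable names are illustrative):

```python
import json

# Excerpt of the aggregated results shown above (task name -> metrics).
results_json = """
{
  "harness|hendrycksTest-computer_security|5": {"acc": 0.75, "acc_stderr": 0.04351941398892446},
  "harness|winogrande|5": {"acc": 0.8429360694554064, "acc_stderr": 0.010226303949598484},
  "harness|gsm8k|5": {"acc": 0.7028051554207733, "acc_stderr": 0.012588685966624174}
}
"""

results = json.loads(results_json)

# Collect the accuracy of every MMLU (hendrycksTest) subtask and average it.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mmlu_avg:.4f}")
```

The same pattern works on the full results file linked in the card, since every task entry shares the `acc`/`acc_stderr` layout.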
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-atom-13b-v9-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-atom-13b-v9-bf16](https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T21:37:39.062296](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16/blob/main/results_2023-10-15T21-37-39.062296.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.057466442953020135,\n\
\ \"em_stderr\": 0.0023833905882384974,\n \"f1\": 0.11402369966442945,\n\
\ \"f1_stderr\": 0.0026622077831256583,\n \"acc\": 0.44356628547732635,\n\
\ \"acc_stderr\": 0.011184922703096678\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.057466442953020135,\n \"em_stderr\": 0.0023833905882384974,\n\
\ \"f1\": 0.11402369966442945,\n \"f1_stderr\": 0.0026622077831256583\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15390447308567096,\n \
\ \"acc_stderr\": 0.00993979930404902\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.012430046102144337\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T21_37_39.062296
path:
- '**/details_harness|drop|3_2023-10-15T21-37-39.062296.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T21-37-39.062296.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T21_37_39.062296
path:
- '**/details_harness|gsm8k|5_2023-10-15T21-37-39.062296.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T21-37-39.062296.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T21_37_39.062296
path:
- '**/details_harness|winogrande|5_2023-10-15T21-37-39.062296.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T21-37-39.062296.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- results_2023-08-17T18:31:32.257089.parquet
- split: 2023_10_15T21_37_39.062296
path:
- results_2023-10-15T21-37-39.062296.parquet
- split: latest
path:
- results_2023-10-15T21-37-39.062296.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-atom-13b-v9-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-atom-13b-v9-bf16](https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T21:37:39.062296](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16/blob/main/results_2023-10-15T21-37-39.062296.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in its results split and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.057466442953020135,
"em_stderr": 0.0023833905882384974,
"f1": 0.11402369966442945,
"f1_stderr": 0.0026622077831256583,
"acc": 0.44356628547732635,
"acc_stderr": 0.011184922703096678
},
"harness|drop|3": {
"em": 0.057466442953020135,
"em_stderr": 0.0023833905882384974,
"f1": 0.11402369966442945,
"f1_stderr": 0.0026622077831256583
},
"harness|gsm8k|5": {
"acc": 0.15390447308567096,
"acc_stderr": 0.00993979930404902
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.012430046102144337
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yhavinga/ultrachat_dutch_chat_template_tokenized_zephyr_7b_alpha_padright_num20000_len1024 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 266480000.0
num_examples: 20000
- name: test
num_bytes: 13324000.0
num_examples: 1000
download_size: 50657761
dataset_size: 279804000.0
---
# Dataset Card for "ultrachat_dutch_chat_template_tokenized_zephyr_7b_alpha_padright_num20000_len1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Patil1515/filteredOrca | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2951757377.902414
num_examples: 1730646
download_size: 2019309214
dataset_size: 2951757377.902414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b3c19030 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 175
num_examples: 10
download_size: 1330
dataset_size: 175
---
# Dataset Card for "b3c19030"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zalupayko/Navalny | ---
license: other
---
|
khanhlinh/EuroSat_covnext | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AnnualCrop
'1': Forest
'2': HerbaceousVegetation
'3': Highway
'4': Industrial
'5': Pasture
'6': PermanentCrop
'7': Residential
'8': River
'9': SeaLake
splits:
- name: train
num_bytes: 88397609.0
num_examples: 27000
download_size: 91979104
dataset_size: 88397609.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
projecte-aina/CA-PT_Parallel_Corpus | ---
YAML tags: null
language:
- ca
- pt
multilinguality:
- multilingual
pretty_name: CA-PT Parallel Corpus
task_categories:
- translation
size_categories:
- 1M<n<10M
license: cc-by-nc-sa-4.0
---
# Dataset Card for CA-PT Parallel Corpus
## Dataset Description
- **Point of Contact:** langtech@bsc.es
### Dataset Summary
The CA-PT Parallel Corpus is a Catalan-Portuguese dataset of **9.892.953** parallel sentences. The dataset was created to support Catalan in NLP tasks, specifically
Machine Translation.
### Supported Tasks and Leaderboards
The dataset can be used to train Bilingual Machine Translation models between Portuguese and Catalan in any direction,
as well as Multilingual Machine Translation models.
### Languages
The sentences included in the dataset are in Catalan (CA) and Portuguese (PT).
## Dataset Structure
### Data Instances
Two separate txt files are provided with the sentences sorted in the same order:
- ca-pt_2023_09_01_full.ca: contains 9.892.953 Catalan sentences.
- ca-pt_2023_09_01_full.pt: contains 9.892.953 Portuguese sentences.
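Because the two files are aligned line by line, sentence pairs can be reconstructed by reading them in parallel. A minimal sketch (the file names come from the list above; adjust paths to wherever the corpus is downloaded):

```python
# Reconstruct Catalan-Portuguese sentence pairs from the two aligned files.
# Assumes both files have the same number of lines, aligned by position.
def read_parallel(ca_path, pt_path):
    with open(ca_path, encoding="utf-8") as f_ca, \
         open(pt_path, encoding="utf-8") as f_pt:
        for ca_line, pt_line in zip(f_ca, f_pt):
            yield ca_line.rstrip("\n"), pt_line.rstrip("\n")
```

Streaming with a generator avoids loading almost 10 million sentence pairs into memory at once.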
### Data Fields
[N/A]
### Data Splits
The dataset contains a single split: `train`.
## Dataset Creation
### Curation Rationale
This dataset is aimed at promoting the development of Machine Translation between Catalan and other languages, specifically Portuguese.
### Source Data
#### Initial Data Collection and Normalization
The dataset is a combination of the following original datasets:
| Dataset | Sentences |
|:-------|-------:|
| CCMatrix v1 | 3.765.459 |
| WikiMatrix | 317.649 |
| GNOME | 1.752 |
| KDE4 | 117.828 |
| QED | 43.736 |
| TED2020 v1 | 41.461 |
| OpenSubtitles | 235.604 |
| GlobalVoices | 3.430 |
| Tatoeba | 723 |
| Europarl | 1.631.989 |
| **Total** | **6.159.631** |
All corpora except Europarl were collected from [Opus](https://opus.nlpl.eu/).
The Europarl corpus is a synthetic parallel corpus created from the original Spanish-Catalan corpus by [SoftCatalà](https://github.com/Softcatala/Europarl-catalan).
The remaining **3.733.322** sentences are synthetic parallel data created from a random sampling of the Spanish-Portuguese corpora
available on [Opus](https://opus.nlpl.eu/) and translated into Catalan using the [PlanTL es-ca](https://huggingface.co/PlanTL-GOB-ES/mt-plantl-es-ca) model.
All datasets are deduplicated and filtered to remove any sentence pairs with a cosine similarity of less than 0.75.
This is done using sentence embeddings calculated using [LaBSE](https://huggingface.co/sentence-transformers/LaBSE).
The filtered datasets are then concatenated to form a final corpus of **9.892.953** parallel sentences.
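The similarity-filtering step described above can be sketched as follows. This is an illustration only, assuming sentence embeddings (e.g., from LaBSE) have already been computed; it is not the exact pipeline used to build the corpus:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def filter_pairs(pairs, src_embs, tgt_embs, threshold=0.75):
    """Keep only sentence pairs whose embedding similarity is >= threshold."""
    kept = []
    for pair, u, v in zip(pairs, src_embs, tgt_embs):
        if cosine_similarity(u, v) >= threshold:
            kept.append(pair)
    return kept
```

The 0.75 threshold matches the cutoff stated above; pairs scoring below it are dropped before the datasets are concatenated.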
#### Who are the source language producers?
[Opus](https://opus.nlpl.eu/)
[SoftCatalà](https://github.com/Softcatala/Europarl-catalan)
### Annotations
#### Annotation process
The dataset does not contain any annotations.
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
Given that this dataset is partly derived from pre-existing datasets that may contain crawled data, and that no specific anonymisation process has been applied,
personal and sensitive information may be present in the data. This needs to be considered when using the data for training models.
## Considerations for Using the Data
### Social Impact of Dataset
By providing this resource, we intend to promote the use of Catalan across NLP tasks, thereby improving the accessibility and visibility of the Catalan language.
### Discussion of Biases
No specific bias mitigation strategies were applied to this dataset.
Inherent biases may exist within the data.
### Other Known Limitations
The dataset contains data of a general domain. Applications of this dataset in more specific domains such as biomedical, legal etc. would be of limited use.
## Additional Information
### Dataset Curators
Language Technologies Unit at the Barcelona Supercomputing Center (langtech@bsc.es).
This work has been promoted and financed by the Generalitat de Catalunya through the [Aina project](https://projecteaina.cat/).
### Licensing Information
This work is licensed under a [Attribution-NonCommercial-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-nc-sa/4.0/).
### Citation Information
[N/A]
### Contributions
[N/A] |
Xilixmeaty40/auto-datasets | ---
license: apache-2.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1e3d19a4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1332
dataset_size: 182
---
# Dataset Card for "1e3d19a4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eunju2834/oil_impressionism_style | ---
task_categories:
- text-generation
size_categories:
- 10K<n<100K
license: artistic-2.0
tags:
- art
--- |
seansullivan/BIDocs2 | ---
license: other
---
|
OPTML-Group/UnlearnCanvas | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 76080381824.0
num_examples: 24400
download_size: 77334395121
dataset_size: 76080381824.0
---
# Dataset Card for UnlearnCanvas
This dataset card introduces "UnlearnCanvas", a high-resolution stylized image dataset for benchmarking generative modeling tasks, in particular for machine unlearning in diffusion models. Developed to address the societal concerns arising from diffusion models, such as harmful content generation, copyright disputes, and the perpetuation of stereotypes and biases, UnlearnCanvas aims at facilitating the evaluation and improvement of machine unlearning methods.
## Dataset Details
### Dataset Description
- **Curated by:** Yihua Zhang, Yimeng Zhang, Yuguang Yao, Jinghan Jia, Jiancheng Liu, Xiaoming Liu, Sijia Liu
- **License:** MIT
UnlearnCanvas is a comprehensive, high-resolution image dataset designed to evaluate the unlearning of artistic painting styles and associated image objects. It contains images across 60 different artistic painting styles, with 400 images for each style across 20 different object categories, making it suitable for a wide range of vision generative modeling tasks beyond machine unlearning, such as style transfer, bias removal, and more.
### Dataset Sources
- **Repository:** [UnlearnCanvas GitHub](https://github.com/OPTML-Group/UnlearnCanvas)
- **Paper:** [UnlearnCanvas Paper on arXiv](https://arxiv.org/abs/2402.11846)
- **Demo:** [HuggingFace Benchmark](https://huggingface.co/spaces/OPTML-Group/UnlearnCanvas-Benchmark)
## Uses
### Direct Use
UnlearnCanvas is intended for direct use in:
- Evaluating machine unlearning methods for diffusion models.
- Benchmarking state-of-the-art machine unlearning techniques.
- Facilitating research in style transfer, bias removal, vision in-context learning, out-of-distribution learning, and other generative modeling tasks.
### Out-of-Scope Use
- Commercial use without proper licensing or attribution may be out of scope, given the MIT license.
## Dataset Structure
The dataset consists of high-resolution images across 60 different artistic painting styles, structured as `./style_name/object_name/image_idx.jpg`, with a separate `./Seed_Image` folder for photo-realistic images. The dataset's balanced structure and high stylistic consistency make it an ideal resource for fine-tuning and evaluating diffusion models.
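Given this layout, the style and object labels can be recovered directly from each file path. A small sketch based only on the directory convention described above (the style and object names in the usage example are hypothetical):

```python
from pathlib import Path

def labels_from_path(image_path):
    """Derive (style, object) labels from a path like style_name/object_name/image_idx.jpg."""
    p = Path(image_path)
    object_name = p.parent.name       # immediate directory: the object category
    style_name = p.parent.parent.name  # directory above it: the painting style
    return style_name, object_name
```

For example, `labels_from_path("Crayon/Dogs/3.jpg")` would yield `("Crayon", "Dogs")`.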
## Dataset Creation
### Curation Rationale
The dataset was curated to address the lack of standardized and automated evaluation frameworks for machine unlearning techniques in diffusion models, facilitating the removal of undesired generative capabilities.
### Source Data
#### Data Collection and Processing
The images were annotated (for stylization) from a set of high-resolution, real-world photo-realistic images collected from [Pexels](https://www.pexels.com/), using the services provided by [Fotor](https://www.fotor.com).
#### Who are the source data producers?
The dataset was produced by a collaborative effort led by Yihua Zhang with contributions from their research team.
## Bias, Risks, and Limitations
The dataset aims to minimize societal concerns related to diffusion models but users should be aware of the potential for misuse. Researchers are encouraged to approach the dataset with an understanding of its scope and limitations, particularly concerning the representation of styles and objects.
### Recommendations
Researchers should ensure ethical use of the dataset, avoiding applications that might generate harmful content or perpetuate biases. Further studies are recommended to explore and mitigate any inherent biases within the dataset.
## Citation
**BibTeX:**
```bibtex
@article{zhang2024unlearncanvas,
title={UnlearnCanvas: A Stylized Image Dataset to Benchmark Machine Unlearning for Diffusion Models},
author={Zhang, Yihua and Zhang, Yimeng and Yao, Yuguang and Jia, Jinghan and Liu, Jiancheng and Liu, Xiaoming and Liu, Sijia},
journal={arXiv preprint arXiv:2402.11846},
year={2024}
}
```
|
one-sec-cv12/chunk_209 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21532616928.75
num_examples: 224186
download_size: 20204672507
dataset_size: 21532616928.75
---
# Dataset Card for "chunk_209"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sagecontinuum/solarirradiancedataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: irradiance
dtype: float32
splits:
- name: full
num_bytes: 13466250
num_examples: 1000
download_size: 14234112
dataset_size: 13466250
tags:
- climate
license: mit
---
# Estimating Solar Irradiance with Image Regression
- **Homepage:** [Sage Continuum](https://sagecontinuum.org/)
- **Author:** Alex Shen, Northwestern University
- **Mentors:** Bhupendra Raut, Seongha Park
- **Repository:** [GitHub Repository](https://github.com/waggle-sensor/summer2023/tree/main/Shen)
# Goal and Importance
Our goal was to create a model to estimate solar irradiance in the sky based on ground images taken from waggle nodes. This could help in the following ways:
- Solar energy generation: Could help predict energy generation more accurately, resulting in improved efficiency and grid management
- Weather forecasting: Could assist meteorologists in predicting weather patterns using solar irradiance levels, and in analyzing current weather conditions
- Climate change: Would help with climate modeling and could contribute to understanding and mitigating global warming
- Smart homes: Would help smart homes manage energy more efficiently (e.g., control certain devices based on irradiance levels)
# Data Preprocessing
In the data preprocessing stage we created a CSV file that mapped each image to its matching solar irradiance value. The images were taken from the Sage Waggle node's top camera and the solar irradiance values were taken from the Argonne National Laboratory tower readings. We made sure to exclude nighttime photos, since there is no sun, and we exclusively used summer-time photos, as we wanted to stick to a seasonal model that would be able to make estimates more consistently. Furthermore, we eventually downsized the original 2000x2000 images to 500x500, since training was taking a bit too long with the larger images.

*Example training image taken from waggle node W039*, 2000x2000 pixels
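The index-building step described above can be sketched roughly as follows. The record layout, column names, and daytime-hour bounds here are assumptions for illustration, not the code actually used:

```python
import csv

def build_index(records, out_csv, start_hour=6, end_hour=20):
    """Write an image -> irradiance CSV, keeping only daytime records.

    `records` is an iterable of (image_path, hour_of_day, irradiance).
    Returns the number of rows kept.
    """
    kept = 0
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["image", "irradiance"])
        for path, hour, irradiance in records:
            if start_hour <= hour < end_hour:  # skip nighttime photos
                writer.writerow([path, irradiance])
                kept += 1
    return kept
```

The same loop is a natural place to add the seasonal filter (keeping only summer dates) mentioned above.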
# Training and Model
In our training, before each image was transformed to a tensor, it was resized down to 224x224 to stay consistent with the pre-trained models. The image was also randomly flipped with a 50% chance and rotated randomly between 0-359 degrees so the model would be able to generalize better. For our model we compared all of the pretrained ResNet models and the VGG-16 model; however, we replaced the last fully connected layer so that the model would give us a continuous value as an estimate instead of a class. We found that the ResNet-50 model performed the best, with the lowest mean absolute error of 82. All in all, we think that the error was small enough to justify creating the plugin. In the plugin, the waggle node simply snaps an image of the sky using its top camera, notes the solar irradiance that the model predicts, and publishes it to the Beehive repository.
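The mean absolute error used to compare the models can be computed as follows; this is a generic sketch, not the project's evaluation code:

```python
def mean_absolute_error(predictions, targets):
    """Average absolute difference between predicted and true irradiance values."""
    assert len(predictions) == len(targets) and len(targets) > 0
    total = sum(abs(p - t) for p, t in zip(predictions, targets))
    return total / len(predictions)
```

Unlike accuracy for classification, MAE keeps the error in the same units as the irradiance readings themselves, which is why an MAE of 82 can be judged directly against typical daytime irradiance levels.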
# Graphs

<br>
_Graph showing the number of times that each margin of error appeared in our testing images. For example, the model predicting 10 when the irradiance is 20 would result in an error of 10, raising the first bar of the bar graph one occurrence higher._
<br>

_This graph plots the predicted irradiance of each test image against its actual irradiance value. The dots cluster mostly around the y=x line, meaning the model predicts accurately on average. Also, since there are points both above and below the line, the model is not biased towards either overestimating or underestimating, which also helps it predict well on average._
# Future Directions
- Increase training data to decrease MAE
- Improve estimates through thin cloud layers, which currently cause the model to severely underestimate the irradiance value when thin clouds cover the image
- Work on identifying correct irradiance values during sunsets and sunrises; the model occasionally overestimates irradiance when the sun is at the image's perimeter due to greater light exposure
- Implement a feature to forecast solar irradiance levels based on the patterns of data gathered
|
open-llm-leaderboard/details_paulml__NMTOB-7B | ---
pretty_name: Evaluation run of paulml/NMTOB-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [paulml/NMTOB-7B](https://huggingface.co/paulml/NMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__NMTOB-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T12:41:06.200570](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__NMTOB-7B/blob/main/results_2024-02-12T12-41-06.200570.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6524767492630037,\n\
\ \"acc_stderr\": 0.03200904140407624,\n \"acc_norm\": 0.6518349230917634,\n\
\ \"acc_norm_stderr\": 0.032679616576566206,\n \"mc1\": 0.6034271725826194,\n\
\ \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.7506354467048914,\n\
\ \"mc2_stderr\": 0.01429535038329162\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.013318528460539419,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7153953395737901,\n\
\ \"acc_stderr\": 0.004503037601847085,\n \"acc_norm\": 0.8893646683927504,\n\
\ \"acc_norm_stderr\": 0.0031303894668331987\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n\
\ \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n\
\ \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6034271725826194,\n\
\ \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.7506354467048914,\n\
\ \"mc2_stderr\": 0.01429535038329162\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184138\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \
\ \"acc_stderr\": 0.01274030571737627\n }\n}\n```"
repo_url: https://huggingface.co/paulml/NMTOB-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|arc:challenge|25_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|gsm8k|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hellaswag|10_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T12-41-06.200570.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- '**/details_harness|winogrande|5_2024-02-12T12-41-06.200570.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T12-41-06.200570.parquet'
- config_name: results
data_files:
- split: 2024_02_12T12_41_06.200570
path:
- results_2024-02-12T12-41-06.200570.parquet
- split: latest
path:
- results_2024-02-12T12-41-06.200570.parquet
---
# Dataset Card for Evaluation run of paulml/NMTOB-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [paulml/NMTOB-7B](https://huggingface.co/paulml/NMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_paulml__NMTOB-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-12T12:41:06.200570](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__NMTOB-7B/blob/main/results_2024-02-12T12-41-06.200570.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6524767492630037,
"acc_stderr": 0.03200904140407624,
"acc_norm": 0.6518349230917634,
"acc_norm_stderr": 0.032679616576566206,
"mc1": 0.6034271725826194,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.7506354467048914,
"mc2_stderr": 0.01429535038329162
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.013318528460539419,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869147
},
"harness|hellaswag|10": {
"acc": 0.7153953395737901,
"acc_stderr": 0.004503037601847085,
"acc_norm": 0.8893646683927504,
"acc_norm_stderr": 0.0031303894668331987
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6034271725826194,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.7506354467048914,
"mc2_stderr": 0.01429535038329162
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184138
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.01274030571737627
}
}
```
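Since the payload above is plain JSON, individual metrics can also be read with the standard library once the results file is downloaded. A minimal sketch using a trimmed copy of the structure shown (the full file contains one entry per evaluated task):

```python
import json

# Trimmed copy of the results structure shown above;
# the real file has one "harness|...|k" entry per evaluated task.
raw = """
{
  "all": {"acc": 0.6524767492630037, "acc_stderr": 0.03200904140407624},
  "harness|winogrande|5": {"acc": 0.8516179952644041, "acc_stderr": 0.009990706005184138}
}
"""
results = json.loads(raw)

# Task keys follow the "harness|<task>|<num_fewshot>" naming used above.
winogrande_acc = results["harness|winogrande|5"]["acc"]
overall_acc = results["all"]["acc"]
```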
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jpcorb20/rag_epfl_guidelines | ---
license: other
license_name: common-crawl
license_link: LICENSE
dataset_info:
features:
- name: doc_id
dtype: string
- name: title
dtype: string
- name: source
dtype: string
- name: text
dtype: string
- name: paragraph_id
dtype: int64
splits:
- name: train
num_bytes: 566284252
num_examples: 710745
download_size: 234020519
dataset_size: 566284252
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
size_categories:
- 100K<n<1M
---
# RAG EPFL-LLM Guidelines
This is a RAG-ready version of the open-source dataset from the [EPFL Guidelines](hf.co/epfl-llm/guidelines), chunked using LangChain's recursive-character text splitter with a chunk size of 1,000 characters and an overlap of 200 characters.
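For reference, the size/overlap parameters above work roughly as follows. LangChain's actual `RecursiveCharacterTextSplitter` recurses over a separator hierarchy (`"\n\n"`, `"\n"`, `" "`, `""`) to keep paragraphs intact, but a naive fixed-window sketch illustrates the chunk geometry (illustrative code only, not the splitter used to build this dataset):

```python
def chunk_text(text: str, chunk_size: int = 1000, chunk_overlap: int = 200) -> list[str]:
    """Naive sliding-window chunker: fixed-size character windows
    advanced by (chunk_size - chunk_overlap), so consecutive chunks
    share chunk_overlap characters."""
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - chunk_overlap, 1), step)]
```

The real splitter additionally prefers breaking at paragraph, line, and word boundaries rather than at arbitrary character offsets.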
## Citation
TBD |
Gabef/production-samples-17 | ---
dataset_info:
features:
- name: features
sequence: float32
- name: labels
dtype: string
splits:
- name: train
num_bytes: 4142196436
num_examples: 16560
download_size: 4000299254
dataset_size: 4142196436
---
# Dataset Card for "production-samples-17"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TCMLM/TCM_Humanities | ---
license: mit
task_categories:
- text-classification
language:
- zh
tags:
- medical
- safety
size_categories:
- n<1K
---
# Dataset Card for TCMLM/TCM_Humanities
<!-- Provide a quick summary of the dataset. -->
This dataset, curated by the Traditional Chinese Medicine Language Model Team, comprises a comprehensive collection of multiple-choice questions (both single and multiple answers) from the Chinese Medical Practitioner Examination. It's designed to aid in understanding and assessing knowledge in Chinese humanities medicine, medical ethics, and legal regulations for physicians.
## Dataset Details
### Dataset Description
- **Curated by:** Traditional Chinese Medicine Language Model Team.
- **Funded by:** Sponsored by family parental funds.
- **Language(s) (NLP):** Primarily in Chinese.
- **License:** MIT License.
## Uses
### Direct Use
This dataset is primarily intended for academic research, educational purposes, and training models in the field of medical humanities, ethics, and law. It can be used to develop AI models that understand and interpret questions related to these fields, aiding in the preparation for medical licensing exams in China.
### Out-of-Scope Use
The dataset is not designed for clinical decision-making or patient care. It should not be used as a standalone resource for legal or ethical advice in medical practices. Commercial use and use in medical scenarios require explicit authorization from the author. Unauthorized use, and any resulting ethical, medical safety, or legal issues, are the responsibility of the user.
## Dataset Structure
### Source Data
The dataset comprises a curated selection of questions from the Chinese Medical Practitioner Examination. These questions encompass various aspects of medical ethics, legal regulations, and humanities in medicine. Each entry in the dataset includes a question number, the question text, multiple choice options, the correct answer, and an explanation for the answer.
For example:
| 题目序号 | 题干 | 选项 | 答案 | 解析 |
| ------- | ---- | ---- | ---- | ---- |
| 1 | 根据《处方管理办法》规定,处方保存期满后,经()批准、登记备案,方可销毁 | "A.医疗机构主要负责人<br>B.卫生行政主管部门医政管理科室<br>C.卫生行政主管部门负责人<br>D.药品监督管理部门" | A | 《处方管理办法》第五十条规定:处方保存期满后,经医疗机构主要负责人批准、登记备案,方可销毁。 |
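Since items mix single-answer and multiple-answer formats, evaluation should compare option letters order-insensitively. A minimal, hypothetical grading sketch (the English field names `answer` and `prediction` are illustrative, not the dataset's actual column names):

```python
# Hypothetical sketch: grade predicted option letters against the answer key.
# Field names ("answer", "prediction") are illustrative only.

def grade(examples):
    """Return accuracy over multiple-choice items.

    Each example is a dict with an "answer" key (e.g. "A", or "ABD" for
    multiple-answer items) and a "prediction" key in the same format.
    """
    if not examples:
        return 0.0
    # Compare as sets of option letters so "ABD" == "ADB".
    correct = sum(
        set(ex["prediction"]) == set(ex["answer"]) for ex in examples
    )
    return correct / len(examples)

items = [
    {"answer": "A", "prediction": "A"},      # single-answer item
    {"answer": "ABD", "prediction": "ADB"},  # multiple-answer, order-insensitive
    {"answer": "C", "prediction": "B"},
]
print(grade(items))  # 2 of 3 items correct
```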
## Bias, Risks, and Limitations
### Bias
- **Cultural and Regional Specificity:** This dataset is specifically derived from the Chinese Medical Practitioner Examination and hence, is deeply rooted in the context of Chinese medical practice, law, and ethics. This focus may not accurately represent the diversity of medical practices, ethical standards, and legal frameworks found in other countries and regions. As a result, the dataset may not be suitable for global generalizations about medical practices.
- **Content Limitation:** The dataset's focus on multiple-choice questions may limit the depth and complexity of understanding that can be conveyed about each topic. Real-world medical, ethical, and legal scenarios are often more nuanced than what can be captured in a standardized test format.
### Risks
- **Misinterpretation:** Users of this dataset, especially those not familiar with the Chinese medical system, might misinterpret the information due to differences in medical practices and regulations across countries. This could lead to incorrect applications of the knowledge in different medical or legal contexts.
- **Educational Use Limitation:** While the dataset can be an excellent resource for educational purposes, it should not be relied upon as the sole source of information for critical decision-making in medical practice or legal advice. Users should consult a variety of resources and professional advice for such purposes.
### Limitations
- **Question Quantity:** The dataset's utility may be limited by the number of questions it contains. A larger number of questions would provide a more comprehensive overview of the various aspects of medical humanities, ethics, and laws in China.
- **Language Barrier:** The dataset is primarily in Chinese, which may limit its accessibility to non-Chinese speaking users. This could hinder its use in international research or educational settings.
- **Commercial and Medical Scenario Use:** The dataset is not authorized for commercial use or medical scenarios without explicit permission from the author. Unauthorized use in these contexts may lead to ethical, medical safety, or legal issues.
### Ethical Considerations
- **Sensitive Content:** Some questions in the dataset might involve sensitive ethical dilemmas or legal issues. Users must approach these topics with the appropriate level of sensitivity and understanding of the cultural context.
- **Respect for Intellectual Property:** The dataset is based on questions from an official examination. Users should respect the intellectual property rights associated with the content and adhere to the provided usage guidelines.
In summary, while the "Chinese Medical Humanities Dataset" provides valuable insights into Chinese medical humanities, ethics, and law, users should be aware of its cultural specificity, content limitations, and potential risks. It is important to use this dataset responsibly, keeping in mind its limitations and the need for a broad, culturally sensitive approach to medical humanities and legal education.
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation
**BibTeX:**
```
@misc{TCM_Humanities,
  author       = {Paris Kang},
  title        = {Chinese Medical Humanities Dataset},
  year         = {2024},
  howpublished = {Hugging Face Dataset Hub},
  url          = {https://huggingface.co/datasets/TCMLM/TCM_Humanities/}
}
```
**APA:**
Kang, P. (2024). *Chinese Medical Humanities Dataset*. Retrieved from https://huggingface.co/datasets/TCMLM/TCM_Humanities/
## Dataset Card Authors
**Author:** Paris Kang, a poet, a practicing physician in oncology with a background in both traditional Chinese and Western medicine, and a doctoral candidate in Electronic Information.
**Contact Email:** 1641866831@qq.com |
presencesw/dataset_2000_decompese_question_1 | ---
dataset_info:
features:
- name: entities
sequence: 'null'
- name: triplets
list:
- name: question
dtype: string
- name: answer
dtype: string
- name: complex_question
dtype: string
splits:
- name: train
num_bytes: 67820
num_examples: 199
download_size: 26545
dataset_size: 67820
---
# Dataset Card for "dataset_2000_decompese_question_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qmeeus/MSNER-nlp | ---
dataset_info:
- config_name: de
features:
- name: tokens
sequence: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 41616289
num_examples: 108473
- name: validation
num_bytes: 791188
num_examples: 2109
- name: test
num_bytes: 747121
num_examples: 1966
download_size: 10480059
dataset_size: 43154598
- config_name: en
features:
- name: tokens
sequence: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 2204014
num_examples: 5000
- name: validation
num_bytes: 735967
num_examples: 1753
- name: test
num_bytes: 742319
num_examples: 1842
download_size: 745400
dataset_size: 3682300
- config_name: es
features:
- name: tokens
sequence: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 25555845
num_examples: 50922
- name: validation
num_bytes: 829913
num_examples: 1631
- name: test
num_bytes: 810712
num_examples: 1512
download_size: 5770971
dataset_size: 27196470
- config_name: fr
features:
- name: tokens
sequence: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 37492920
num_examples: 73561
- name: validation
num_bytes: 895731
num_examples: 1727
- name: test
num_bytes: 816506
num_examples: 1656
download_size: 8204258
dataset_size: 39205157
- config_name: nl
features:
- name: tokens
sequence: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 7597460
num_examples: 20968
- name: validation
num_bytes: 453646
num_examples: 1230
- name: test
num_bytes: 434877
num_examples: 1120
download_size: 1947747
dataset_size: 8485983
configs:
- config_name: de
data_files:
- split: train
path: de/train-*
- split: validation
path: de/validation-*
- split: test
path: de/test-*
- config_name: en
data_files:
- split: train
path: en/train-*
- split: validation
path: en/validation-*
- split: test
path: en/test-*
- config_name: es
data_files:
- split: train
path: es/train-*
- split: validation
path: es/validation-*
- split: test
path: es/test-*
- config_name: fr
data_files:
- split: train
path: fr/train-*
- split: validation
path: fr/validation-*
- split: test
path: fr/test-*
- config_name: nl
data_files:
- split: train
path: nl/train-*
- split: validation
path: nl/validation-*
- split: test
path: nl/test-*
task_categories:
- token-classification
language:
- de
- fr
- nl
- es
- en
---
# Dataset Card for "MSNER-nlp"
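The `tokens`/`tags` schema above is the usual shape for token classification. Assuming the tags follow the standard BIO scheme (an assumption; inspect the data to confirm the tag format), a minimal decoder from tag sequences to entity spans might look like:

```python
# Minimal BIO-tag decoder (assumes standard B-/I-/O tags; verify against the
# actual tag inventory of this dataset before relying on it).

def bio_to_spans(tokens, tags):
    """Return (entity_type, start, end) spans over tokens, end exclusive."""
    spans = []
    start, ent_type = None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        if tag.startswith("I-") and ent_type == tag[2:]:
            continue  # extend the current span
        if start is not None:
            spans.append((ent_type, start, i))
            start, ent_type = None, None
        if tag.startswith(("B-", "I-")):
            start, ent_type = i, tag[2:]
    return spans

tokens = ["Angela", "Merkel", "visited", "Paris"]
tags = ["B-PER", "I-PER", "O", "B-LOC"]
print(bio_to_spans(tokens, tags))  # [('PER', 0, 2), ('LOC', 3, 4)]
```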
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ferrag/sof | ---
license: apache-2.0
---
|
arieg/bw_spec_cls_80_26 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '62001'
'1': '62003'
'2': '62005'
'3': '62007'
'4': '62163'
'5': '62164'
'6': '62165'
'7': '62180'
'8': '62183'
'9': '62185'
'10': '62186'
'11': '62187'
'12': '62188'
'13': '62189'
'14': '62190'
'15': '62191'
'16': '62192'
'17': '62193'
'18': '62194'
'19': '62195'
'20': '62196'
'21': '62337'
'22': '62426'
'23': '62436'
'24': '62445'
'25': '62446'
'26': '62448'
'27': '62449'
'28': '62450'
'29': '62452'
'30': '62458'
'31': '62525'
'32': '62526'
'33': '62527'
'34': '62528'
'35': '62529'
'36': '62531'
'37': '62532'
'38': '62533'
'39': '62534'
'40': '62586'
'41': '62589'
'42': '62591'
'43': '62592'
'44': '62594'
'45': '62595'
'46': '62596'
'47': '62655'
'48': '62671'
'49': '62742'
'50': '62748'
'51': '62749'
'52': '62750'
'53': '62751'
'54': '62753'
'55': '63043'
'56': '63044'
'57': '63045'
'58': '63117'
'59': '63191'
'60': '63208'
'61': '63224'
'62': '63226'
'63': '63287'
'64': '63289'
'65': '63290'
'66': '63291'
'67': '63292'
'68': '63470'
'69': '63471'
'70': '63472'
'71': '63626'
'72': '63655'
'73': '63733'
'74': '63747'
'75': '63755'
'76': '63757'
'77': '63770'
'78': '63789'
'79': '63803'
splits:
- name: train
num_bytes: 89137873.6
num_examples: 1600
- name: test
num_bytes: 22127983.0
num_examples: 400
download_size: 110364015
dataset_size: 111265856.6
---
# Dataset Card for "bw_spec_cls_80_26"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_9_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1418940
num_examples: 3145
download_size: 633095
dataset_size: 1418940
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_9_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AI-Lab-Makerere/beans | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
pretty_name: Beans
dataset_info:
features:
- name: image_file_path
dtype: string
- name: image
dtype: image
- name: labels
dtype:
class_label:
names:
'0': angular_leaf_spot
'1': bean_rust
'2': healthy
splits:
- name: train
num_bytes: 143762054.662
num_examples: 1034
- name: validation
num_bytes: 18515527.0
num_examples: 133
- name: test
num_bytes: 17720308.0
num_examples: 128
download_size: 179978834
dataset_size: 179997889.662
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for Beans
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Beans Homepage](https://github.com/AI-Lab-Makerere/ibean/)
- **Repository:** [AI-Lab-Makerere/ibean](https://github.com/AI-Lab-Makerere/ibean/)
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** N/A
### Dataset Summary
Beans leaf dataset with images of diseased and healthy leaves.
### Supported Tasks and Leaderboards
- `image-classification`: Based on a leaf image, the goal of this task is to predict the disease type (Angular Leaf Spot and Bean Rust), if any.
### Languages
English
## Dataset Structure
### Data Instances
A sample from the training set is provided below:
```
{
'image_file_path': '/root/.cache/huggingface/datasets/downloads/extracted/0aaa78294d4bf5114f58547e48d91b7826649919505379a167decb629aa92b0a/train/bean_rust/bean_rust_train.109.jpg',
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=500x500 at 0x16BAA72A4A8>,
'labels': 1
}
```
### Data Fields
The data instances have the following fields:
- `image_file_path`: a `string` filepath to an image.
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `labels`: an `int` classification label.
Class Label Mappings:
```json
{
"angular_leaf_spot": 0,
"bean_rust": 1,
  "healthy": 2
}
```
### Data Splits
| |train|validation|test|
|-------------|----:|---------:|---:|
|# of examples|1034 |133 |128 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@ONLINE {beansdata,
author="Makerere AI Lab",
title="Bean disease dataset",
month="January",
year="2020",
url="https://github.com/AI-Lab-Makerere/ibean/"
}
```
### Contributions
Thanks to [@nateraw](https://github.com/nateraw) for adding this dataset. |