| datasetId | card |
|---|---|
Zombely/wikisource-red | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train_1
num_bytes: 12996760047.0
num_examples: 10000
- name: train_2
num_bytes: 10554030726.546
num_examples: 9998
- name: train_3
num_bytes: 13696109295.506
num_examples: 9999
- name: train_4
num_bytes: 15480963077.0
num_examples: 10000
- name: train_5
num_bytes: 13559162557.0
num_examples: 10000
- name: validation
num_bytes: 2388915116.642
num_examples: 1542
download_size: 2424783685
dataset_size: 68675940819.694
---
# Dataset Card for "wikisource-red"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713060469 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2279465
num_examples: 7139
download_size: 1275495
dataset_size: 2279465
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MoritzLaurer/zeroshot_test_downsampled | ---
dataset_info:
features:
- name: text
dtype: string
- name: hypothesis
dtype: string
- name: labels
dtype:
class_label:
names:
'0': entailment
'1': not_entailment
- name: task_name
dtype: string
- name: label_text
dtype: string
splits:
- name: mnli_m
num_bytes: 2055427
num_examples: 9815
- name: mnli_mm
num_bytes: 2181179
num_examples: 9832
- name: fevernli
num_bytes: 7532028
num_examples: 19652
- name: anli_r1
num_bytes: 433064
num_examples: 1000
- name: anli_r2
num_bytes: 432927
num_examples: 1000
- name: anli_r3
num_bytes: 501290
num_examples: 1200
- name: wanli
num_bytes: 940472
num_examples: 5000
- name: lingnli
num_bytes: 1078241
num_examples: 4893
- name: wellformedquery
num_bytes: 274932
num_examples: 2000
- name: rottentomatoes
num_bytes: 463520
num_examples: 2000
- name: amazonpolarity
num_bytes: 1073968
num_examples: 2000
- name: imdb
num_bytes: 2807450
num_examples: 2006
- name: yelpreviews
num_bytes: 1581332
num_examples: 2000
- name: hatexplain
num_bytes: 709210
num_examples: 2910
- name: massive
num_bytes: 23680622
num_examples: 172492
- name: banking77
num_bytes: 40009400
num_examples: 221760
- name: emotiondair
num_bytes: 1902532
num_examples: 10344
- name: emocontext
num_bytes: 880077
num_examples: 5340
- name: empathetic
num_bytes: 52141900
num_examples: 81344
- name: agnews
num_bytes: 2544632
num_examples: 8000
- name: yahootopics
num_bytes: 34686310
num_examples: 50000
- name: biasframes_sex
num_bytes: 314648
num_examples: 1510
- name: biasframes_offensive
num_bytes: 465662
num_examples: 2000
- name: biasframes_intent
num_bytes: 438394
num_examples: 2000
- name: financialphrasebank
num_bytes: 515448
num_examples: 2070
- name: appreviews
num_bytes: 604460
num_examples: 2000
- name: hateoffensive
num_bytes: 495508
num_examples: 2586
- name: trueteacher
num_bytes: 2783064
num_examples: 2000
- name: spam
num_bytes: 181876
num_examples: 1262
- name: wikitoxic_toxicaggregated
num_bytes: 923604
num_examples: 2000
- name: wikitoxic_obscene
num_bytes: 894472
num_examples: 2000
- name: wikitoxic_identityhate
num_bytes: 1010608
num_examples: 2000
- name: wikitoxic_threat
num_bytes: 725658
num_examples: 1422
- name: wikitoxic_insult
num_bytes: 833066
num_examples: 2000
- name: manifesto
num_bytes: 300869505
num_examples: 685720
- name: capsotu
num_bytes: 23150995
num_examples: 66444
download_size: 26325656
dataset_size: 512117481
configs:
- config_name: default
data_files:
- split: mnli_m
path: data/mnli_m-*
- split: mnli_mm
path: data/mnli_mm-*
- split: fevernli
path: data/fevernli-*
- split: anli_r1
path: data/anli_r1-*
- split: anli_r2
path: data/anli_r2-*
- split: anli_r3
path: data/anli_r3-*
- split: wanli
path: data/wanli-*
- split: lingnli
path: data/lingnli-*
- split: wellformedquery
path: data/wellformedquery-*
- split: rottentomatoes
path: data/rottentomatoes-*
- split: amazonpolarity
path: data/amazonpolarity-*
- split: imdb
path: data/imdb-*
- split: yelpreviews
path: data/yelpreviews-*
- split: hatexplain
path: data/hatexplain-*
- split: massive
path: data/massive-*
- split: banking77
path: data/banking77-*
- split: emotiondair
path: data/emotiondair-*
- split: emocontext
path: data/emocontext-*
- split: empathetic
path: data/empathetic-*
- split: agnews
path: data/agnews-*
- split: yahootopics
path: data/yahootopics-*
- split: biasframes_sex
path: data/biasframes_sex-*
- split: biasframes_offensive
path: data/biasframes_offensive-*
- split: biasframes_intent
path: data/biasframes_intent-*
- split: financialphrasebank
path: data/financialphrasebank-*
- split: appreviews
path: data/appreviews-*
- split: hateoffensive
path: data/hateoffensive-*
- split: trueteacher
path: data/trueteacher-*
- split: spam
path: data/spam-*
- split: wikitoxic_toxicaggregated
path: data/wikitoxic_toxicaggregated-*
- split: wikitoxic_obscene
path: data/wikitoxic_obscene-*
- split: wikitoxic_identityhate
path: data/wikitoxic_identityhate-*
- split: wikitoxic_threat
path: data/wikitoxic_threat-*
- split: wikitoxic_insult
path: data/wikitoxic_insult-*
- split: manifesto
path: data/manifesto-*
- split: capsotu
path: data/capsotu-*
---
|
jagaldol/chat-foodie | ---
license: cc
---
|
goodemagod/sommy-2.1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4000
num_examples: 1000
download_size: 715
dataset_size: 4000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jlbaker361/tex_inv_reward_ip_vanilla | ---
dataset_info:
features:
- name: label
dtype: string
- name: tex_inv_reward_ip_prompt_similarity
dtype: float32
- name: tex_inv_reward_ip_identity_consistency
dtype: float32
- name: tex_inv_reward_ip_negative_prompt_similarity
dtype: float32
- name: tex_inv_reward_ip_target_prompt_similarity
dtype: float32
- name: tex_inv_reward_ip_aesthetic_score
dtype: float32
splits:
- name: train
num_bytes: 308
num_examples: 11
download_size: 4333
dataset_size: 308
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tasksource/QAmeleon | ---
license: cc-by-4.0
---
|
irds/mmarco_v2_ru | ---
pretty_name: '`mmarco/v2/ru`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/ru`
The `mmarco/v2/ru` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/ru).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`mmarco_v2_ru_dev`](https://huggingface.co/datasets/irds/mmarco_v2_ru_dev), [`mmarco_v2_ru_train`](https://huggingface.co/datasets/irds/mmarco_v2_ru_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mmarco_v2_ru', 'docs')
for record in docs:
    record  # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
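Since the corpus holds 8,841,823 documents, it is often handy to inspect just a few records before iterating everything. A minimal sketch building on the snippet above (the only addition is `itertools.islice` from the standard library):
```python
from itertools import islice

from datasets import load_dataset

docs = load_dataset('irds/mmarco_v2_ru', 'docs')
# Inspect the first three documents without iterating the full 8.8M-record corpus.
for record in islice(docs, 3):
    print(record['doc_id'], record['text'][:80])
```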
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
vmadhuvarshi/dataset | ---
license: mit
---
|
SaladSlayer00/twin_matcher | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 361982594.452
num_examples: 8962
- name: test
num_bytes: 3592023.0
num_examples: 92
download_size: 365589841
dataset_size: 365574617.452
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
result-kand2-sdxl-wuerst-karlo/eda9bdbf | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 167
num_examples: 10
download_size: 1318
dataset_size: 167
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eda9bdbf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_24_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7812284
num_examples: 2530
download_size: 1336994
dataset_size: 7812284
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_24_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.all_36 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30568864409.0
num_examples: 268338
download_size: 30326426254
dataset_size: 30568864409.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
mteb/sts15-sts | ---
language:
- en
--- |
jg583/NSynth | ---
license: cc-by-4.0
dataset_info:
features:
- name: id
dtype: string
- name: note
dtype: int64
- name: note_str
dtype: string
- name: instrument
dtype: int64
- name: instrument_str
dtype: string
- name: pitch
dtype: int64
- name: velocity
dtype: int64
- name: sample_rate
dtype: int64
- name: qualities
sequence: int64
- name: qualities_str
sequence: string
- name: instrument_family
dtype: int64
- name: instrument_family_str
dtype: string
- name: instrument_source
dtype: int64
- name: instrument_source_str
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 129511245
num_examples: 289205
- name: validation
num_bytes: 5679042
num_examples: 12678
- name: test
num_bytes: 1830670
num_examples: 4096
download_size: 25233566634
dataset_size: 137020957
task_categories:
- audio-to-audio
- audio-classification
tags:
- music
pretty_name: NSynth
size_categories:
- 100K<n<1M
---
# Dataset Card for NSynth
<!-- Provide a quick summary of the dataset. -->
The NSynth dataset is an audio dataset containing over 300,000 musical notes across over 1000 commercially-sampled instruments, distinguished by pitch, timbre, and envelope. Each recording was made by playing and holding a musical note for three seconds and letting it decay for one second. The collection of four-second recordings ranges over every pitch on a standard MIDI piano (or as many as possible for the given instrument), played at five different velocities.
This dataset was created as an attempt to establish a high-quality entry point into audio machine learning, in response to the surge of breakthroughs in generative modeling of images due to the abundance of approachable image datasets (MNIST, CIFAR, ImageNet). NSynth is meant to be both a benchmark for audio ML and a foundation to be expanded on with future datasets.
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
Since some instruments are not capable of producing all 88 pitches in the MIDI piano's range, there is an average of 65.4 pitches per instrument. Furthermore, the commercial sample packs occasionally contain duplicate sounds across multiple velocities, leaving an average of 4.75 unique velocities per pitch.
Each of the notes is annotated with three additional pieces of information based on a combination of human evaluation and heuristic algorithms:
1. Source: The method of sound production for the note’s instrument. This can be one of `acoustic` or `electronic` for instruments that were recorded from acoustic or electronic instruments, respectively, or `synthetic` for synthesized instruments.
|Index|ID|
|:----|:----|
|0|acoustic|
|1|electronic|
|2|synthetic|
2. Family: The high-level family of which the note’s instrument is a member. Each instrument is a member of exactly one family. See the complete list of families and their frequencies by source below.
|**Index**|**ID**|
|:---|:---|
|0|bass|
|1|brass|
|2|flute|
|3|guitar|
|4|keyboard|
|5|mallet|
|6|organ|
|7|reed|
|8|string|
|9|synth_lead|
|10|vocal|

|**Family**|**Acoustic**|**Electronic**|**Synthetic**|**Total**|
|:----|:----|:----|:----|:----|
|Bass|200|8387|60368|68955|
|Brass|13760|70|0|13830|
|Flute|6572|35|2816|9423|
|Guitar|13343|16805|5275|35423|
|Keyboard|8508|42645|3838|54991|
|Mallet|27722|5581|1763|35066|
|Organ|176|36401|0|36577|
|Reed|14262|76|528|14866|
|String|20510|84|0|20594|
|Synth Lead|0|0|5501|5501|
|Vocal|3925|140|6688|10753|
|**Total**|108978|110224|86777|305979|
3. Qualities: Sonic qualities of the note. See below for descriptions of the qualities (a small decoding sketch follows the table), and [here](https://magenta.tensorflow.org/datasets/nsynth#quality-co-occurrences) for information on co-occurrences between qualities.
|**Index**|**ID**|**Description**|
|:----|:----|:----|
|0|`bright`|A large amount of high frequency content and strong upper harmonics.|
|1|`dark`|A distinct lack of high frequency content, giving a muted and bassy sound. Also sometimes described as ‘Warm’.|
|2|`distortion`|Waveshaping that produces a distinctive crunchy sound and presence of many harmonics. Sometimes paired with non-harmonic noise.|
|3|`fast_decay`|Amplitude envelope of all harmonics decays substantially before the ‘note-off’ point at 3 seconds.|
|4|`long_release`|Amplitude envelope decays slowly after the ‘note-off’ point, sometimes still present at the end of the sample at 4 seconds.|
|5|`multiphonic`|Presence of overtone frequencies related to more than one fundamental frequency.|
|6|`nonlinear_env`|Modulation of the sound with a distinct envelope behavior different than the monotonic decrease of the note. Can also include filter envelopes as well as dynamic envelopes.|
|7|`percussive`|A loud non-harmonic sound at note onset.|
|8|`reverb`|Room acoustics that were not able to be removed from the original sample.|
|9|`tempo-synced`|Rhythmic modulation of the sound to a fixed tempo.|
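The binary `qualities` vector can be mapped back to these string IDs by index. A minimal sketch (the name list is transcribed from the table above, not an official constant):
```python
# Quality IDs in index order, transcribed from the table above.
QUALITIES = [
    "bright", "dark", "distortion", "fast_decay", "long_release",
    "multiphonic", "nonlinear_env", "percussive", "reverb", "tempo-synced",
]

def decode_qualities(qualities):
    """Return the string IDs whose flags are set in a binary qualities vector."""
    return [name for name, flag in zip(QUALITIES, qualities) if flag]

decode_qualities([0, 1, 0, 0, 0, 0, 0, 0, 0, 0])  # -> ['dark']
```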
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Homepage:** https://magenta.tensorflow.org/datasets/nsynth
- **Paper:** https://arxiv.org/abs/1704.01279
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
This dataset has seen much use in models for generating audio, and some of these models have even been used by high-profile artists. Another natural application is classification (identifying instruments, or perhaps qualities of the music, which could be useful for tasks like music recommendation). See [here](https://colab.research.google.com/drive/16u5dvqWxA7o9S0iC6E8B3S77piFZ0BYL#scrollTo=Q5BGqIb87Pek&uniqifier=2) for one such example (a work in progress).
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
The dataset has three splits (a minimal loading sketch follows the list):
* Train: A training set with 289,205 examples. Instruments do not overlap with valid or test.
* Valid: A validation set with 12,678 examples. Instruments do not overlap with train.
* Test: A test set with 4,096 examples. Instruments do not overlap with train.
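A minimal loading sketch (the full download is roughly 25 GB, so streaming is used here, on the assumption that the loading script supports it):
```python
from datasets import load_dataset

# Stream the small test split rather than downloading the full ~25 GB archive.
test = load_dataset("jg583/NSynth", split="test", streaming=True)
example = next(iter(test))
print(example["note_str"], example["pitch"], example["velocity"])
```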
See below for descriptions of the features.
|Feature|Type|Description|
|:----|:----|:----|
|note|`int64`|A unique integer identifier for the note.|
|note_str|`str`|A unique string identifier for the note in the format `<instrument_str>-<pitch>-<velocity>`.|
|instrument|`int64`|A unique, sequential identifier for the instrument the note was synthesized from.|
|instrument_str|`str`|A unique string identifier for the instrument this note was synthesized from in the format `<instrument_family_str>-<instrument_production_str>-<instrument_name>`.|
|pitch|`int64`|The 0-based MIDI pitch in the range \[0, 127\].|
|velocity|`int64`|The 0-based MIDI velocity in the range \[0, 127\].|
|sample_rate|`int64`|The samples per second for the audio feature.|
|qualities|`[int64]`|A binary vector representing which sonic qualities are present in this note.|
|qualities_str|`[str]`|A list of the IDs of the qualities present in this note, drawn from the sonic qualities list above.|
|instrument_family|`int64`|The index of the instrument family this instrument is a member of.|
|instrument_family_str|`str`|The ID of the instrument family this instrument is a member of.|
|instrument_source|`int64`|The index of the sonic source for this instrument.|
|instrument_source_str|`str`|The ID of the sonic source for this instrument.|
|audio|`{'path': str, 'array': [float], 'sampling_rate': int64}`|A dictionary containing a path to the corresponding audio file, a list of audio samples represented as floating point values in the range \[-1,1\], and the sampling rate.|
An example instance generated with the loading script (note that this differs from the example instance on the homepage, as the script integrates the audio into the respective JSON files):
```
{'note': 84147,
'note_str': 'bass_synthetic_033-035-050',
'instrument': 417,
'instrument_str': 'bass_synthetic_033',
'pitch': 35,
'velocity': 50,
'sample_rate': 16000,
'qualities': [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
'qualities_str': ['dark'],
'instrument_family': 0,
'instrument_family_str': 'bass',
'instrument_source': 2,
'instrument_source_str': 'synthetic',
'audio': {'path': '/root/.cache/huggingface/datasets/downloads/extracted/335ef507846fb65b0b87154c22cefd1fe87ea83e8253ef1f72648a3fdfac9a5f/nsynth-test/audio/bass_synthetic_033-035-050.wav',
'array': array([0., 0., 0., ..., 0., 0., 0.]),
'sampling_rate': 16000}
}
```
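As a quick sanity check, the four-second length of each note can be recovered from the `audio` feature; a sketch over an example like the one above:
```python
def note_duration_seconds(example):
    """Duration in seconds, computed from the decoded audio feature."""
    audio = example["audio"]
    return len(audio["array"]) / audio["sampling_rate"]

# A full NSynth note at 16 kHz has 64,000 samples, i.e. 4.0 seconds.
```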
## Potential Shortcomings
There are quite a few family-source pairings with little or no representation. While this is understandable in some cases (there is no acoustic Synth Lead, for instance), it may be problematic in others: there are no synthetic brass, string, or organ samples, and fewer than 100 electronic brass, flute, reed, and string samples each. This can be particularly troublesome in classification problems, as there may not be enough data for a model to distinguish between sources within a given instrument family. In music generation, on the other hand, these disparities may bias a model toward one source over the others for a given family.
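These gaps are easy to verify with a tally over the metadata; a sketch (field names taken from the feature table above):
```python
from collections import Counter

def family_source_counts(dataset):
    """Tally (family, source) pairings to expose under-represented combinations."""
    return Counter(
        (ex["instrument_family_str"], ex["instrument_source_str"]) for ex in dataset
    )
```
Pairs that come back missing or rare in the resulting counter are exactly the ones flagged above.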
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```
Jesse Engel, Cinjon Resnick, Adam Roberts, Sander Dieleman, Douglas Eck,
Karen Simonyan, and Mohammad Norouzi. "Neural Audio Synthesis of Musical Notes
with WaveNet Autoencoders." 2017.
```
**BibTeX:**
```
@misc{nsynth2017,
Author = {Jesse Engel and Cinjon Resnick and Adam Roberts and
Sander Dieleman and Douglas Eck and Karen Simonyan and
Mohammad Norouzi},
Title = {Neural Audio Synthesis of Musical Notes with WaveNet Autoencoders},
Year = {2017},
Eprint = {arXiv:1704.01279},
}
```
## Dataset Card Authors
John Gillen |
kinianlo/wikipedia_pos_tagged | ---
dataset_info:
- config_name: 20220301_en_nltk
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: pos_tags
sequence:
sequence:
sequence: string
splits:
- name: train
num_bytes: 88585221192
num_examples: 6458670
download_size: 3527644902
dataset_size: 88585221192
- config_name: 20220301_en_nltk_tags_only
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: pos_tags
sequence:
sequence:
sequence: string
splits:
- name: train
num_bytes: 68920385173
num_examples: 6458670
download_size: 0
dataset_size: 68920385173
- config_name: 20220301_simple_nltk
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: pos_tags
sequence:
sequence:
sequence: string
splits:
- name: train
num_bytes: 1000903680
num_examples: 205328
download_size: 286763992
dataset_size: 1000903680
- config_name: 20220301_simple_nltk_tags_only
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: pos_tags
sequence:
sequence:
sequence: string
splits:
- name: train
num_bytes: 783729741
num_examples: 205328
download_size: 161414334
dataset_size: 783729741
- config_name: 20220301_simple_spacy
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: pos_tags
sequence:
sequence:
sequence: string
splits:
- name: train
num_bytes: 1131814443
num_examples: 205328
download_size: 289479815
dataset_size: 1131814443
- config_name: 20220301_simple_spacy_tags_only
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: pos_tags
sequence:
sequence:
sequence: string
splits:
- name: train
num_bytes: 914640504
num_examples: 205328
download_size: 164284823
dataset_size: 914640504
configs:
- config_name: 20220301_en_nltk
data_files:
- split: train
path: 20220301_en_nltk/train-*
- config_name: 20220301_en_nltk_tags_only
data_files:
- split: train
path: 20220301_en_nltk_tags_only/train-*
- config_name: 20220301_simple_nltk
data_files:
- split: train
path: 20220301_simple_nltk/train-*
- config_name: 20220301_simple_nltk_tags_only
data_files:
- split: train
path: 20220301_simple_nltk_tags_only/train-*
- config_name: 20220301_simple_spacy
data_files:
- split: train
path: 20220301_simple_spacy/train-*
- config_name: 20220301_simple_spacy_tags_only
data_files:
- split: train
path: 20220301_simple_spacy_tags_only/train-*
---
# Dataset Card for "wikipedia_pos_tagged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Circularmachines/Batch_indexing_machine_pred_csv | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost | ---
pretty_name: Evaluation run of 0-hero/Matter-0.1-7B-boost
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0-hero/Matter-0.1-7B-boost](https://huggingface.co/0-hero/Matter-0.1-7B-boost)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T00:38:52.022465](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost/blob/main/results_2024-03-22T00-38-52.022465.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6183547909386493,\n\
\ \"acc_stderr\": 0.032866006884011485,\n \"acc_norm\": 0.6230999773290488,\n\
\ \"acc_norm_stderr\": 0.0335251160561431,\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5470240437403702,\n\
\ \"mc2_stderr\": 0.015336831369535142\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221005,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759091\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6265684126667994,\n\
\ \"acc_stderr\": 0.004827266662144028,\n \"acc_norm\": 0.8150766779525991,\n\
\ \"acc_norm_stderr\": 0.003874419065658617\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153324,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153324\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296413,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296413\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165545,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5470240437403702,\n\
\ \"mc2_stderr\": 0.015336831369535142\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224176\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42608036391205456,\n \
\ \"acc_stderr\": 0.013621144396086707\n }\n}\n```"
repo_url: https://huggingface.co/0-hero/Matter-0.1-7B-boost
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-38-52.022465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-38-52.022465.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- '**/details_harness|winogrande|5_2024-03-22T00-38-52.022465.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T00-38-52.022465.parquet'
- config_name: results
data_files:
- split: 2024_03_22T00_38_52.022465
path:
- results_2024-03-22T00-38-52.022465.parquet
- split: latest
path:
- results_2024-03-22T00-38-52.022465.parquet
---
# Dataset Card for Evaluation run of 0-hero/Matter-0.1-7B-boost
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0-hero/Matter-0.1-7B-boost](https://huggingface.co/0-hero/Matter-0.1-7B-boost) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost",
"harness_winogrande_5",
split="train")
```
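The aggregated "results" configuration can be loaded the same way; a minimal sketch, pointing it at the "latest" split defined in this repository's configs:
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost",
	"results",
	split="latest")
print(results[0])  # one row holding the aggregated metrics
```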
## Latest results
These are the [latest results from run 2024-03-22T00:38:52.022465](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost/blob/main/results_2024-03-22T00-38-52.022465.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6183547909386493,
"acc_stderr": 0.032866006884011485,
"acc_norm": 0.6230999773290488,
"acc_norm_stderr": 0.0335251160561431,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5470240437403702,
"mc2_stderr": 0.015336831369535142
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221005,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759091
},
"harness|hellaswag|10": {
"acc": 0.6265684126667994,
"acc_stderr": 0.004827266662144028,
"acc_norm": 0.8150766779525991,
"acc_norm_stderr": 0.003874419065658617
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153324,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153324
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296413,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296413
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165545,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5470240437403702,
"mc2_stderr": 0.015336831369535142
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224176
},
"harness|gsm8k|5": {
"acc": 0.42608036391205456,
"acc_stderr": 0.013621144396086707
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
qgallouedec/prj_gia_dataset_metaworld_reach_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the reach-v2 environment, sampled from the policy reach-v2.
This dataset was created as part of the Generally Intelligent Agents (gia) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_reach_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_reach_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
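As a hedged follow-up (this assumes the four arrays are aligned per timestep and that `dones` flags episode ends, which the key names suggest but the card does not state), the flat arrays can be split back into episodes:
```python
import numpy as np

dataset = np.load("prj_gia_dataset_metaworld_reach_v2_1111/dataset.npy", allow_pickle=True).item()

ends = np.flatnonzero(dataset["dones"]) + 1  # indices one past each episode end

episodes, start = [], 0
for end in ends:
    episodes.append({key: values[start:end] for key, values in dataset.items()})
    start = end
print(len(episodes), "episodes recovered")
```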
|
ibranze/araproje_hellaswag_tr_conf_mgpt_worstscore | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 86961
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_mgpt_worstscore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
paul-w-qs/handling_charges_current_v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: JSON_LABEL
dtype: string
splits:
- name: train
num_bytes: 363782668.918
num_examples: 1093
download_size: 362595207
dataset_size: 363782668.918
---
# Dataset Card for "handling_charges_current_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/metatree_RandomRBF_50_1E_3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 69928800
num_examples: 699288
- name: validation
num_bytes: 30071200
num_examples: 300712
download_size: 103917487
dataset_size: 100000000
---
# Dataset Card for "metatree_RandomRBF_50_1E_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mickylan2367/GraySpectrogram | ---
license: cc-by-sa-4.0
language:
- en
tags:
- music
size_categories:
- 1K<n<10K
---
# Spectrograms generated from the audio of Google/MusicCaps.
* What is MusicCaps: https://huggingface.co/datasets/google/MusicCaps
* There is also a non-grayscale (color) version, so take a look (⋈◍>◡<◍)。✧♡ (<a href="https://huggingface.co/datasets/mickylan2367/ColorSpectrogram">here</a>)
## Basic information
* sampling_rate: int = 44100
* 20-second wav files -> converted to 1600×800 png files (a rough sketch of this conversion follows this list)
* Following librosa's conventions, the vertical axis of each image covers (0-10000? Hz) and the horizontal axis (0-40 seconds)
* For details, see librosa.specshow() -> https://librosa.org/doc/main/auto_examples/plot_display.html
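Below is a rough, hypothetical sketch of how one 20-second clip could be turned into such an image; the mel scaling, figure-size arithmetic, and file names are assumptions rather than the exact script used, and `bbox_inches="tight"` makes the final pixel size only approximately 1600×800 (see the Colab notebook linked under the notes for the real code):
```py
import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical input: any 20-second clip downloaded from MusicCaps.
y, sr = librosa.load("music_clip.wav", sr=44100)

S = librosa.feature.melspectrogram(y=y, sr=sr)
S_db = librosa.power_to_db(S, ref=np.max)

fig = plt.figure(figsize=(16, 8), dpi=100)  # aims at roughly 1600x800 pixels
librosa.display.specshow(S_db, sr=sr, x_axis="time", y_axis="mel")
plt.axis("off")
plt.savefig("spectrogram_0.png", bbox_inches="tight", pad_inches=0)
plt.close(fig)
```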
## Usage
### 0: Download the dataset
```py
from datasets import load_dataset
data = load_dataset("mickylan2367/spectrogram")
data = data["train"]
```
### 1: Into DataLoaders
* A function along the following lines turns the dataset into DataLoaders.
```py
from datasets import Dataset, load_dataset
from torchvision import transforms
from torch.utils.data import DataLoader

BATCH_SIZE = ???  # set this yourself
IMG_SIZE = ???    # edge length the images are resized to
TRAIN_SIZE = ???  # number of samples to use for training
TEST_SIZE = ???   # number of samples to use for testing

def load_datasets():
    data_transforms = [
        transforms.Resize((IMG_SIZE, IMG_SIZE)),
        transforms.ToTensor(),                     # scales data into [0,1]
        transforms.Lambda(lambda t: (t * 2) - 1),  # scale between [-1, 1]
    ]
    data_transform = transforms.Compose(data_transforms)

    data = load_dataset("mickylan2367/spectrogram")
    data = data["train"]
    train = data[slice(0, TRAIN_SIZE, None)]
    test = data[slice(TRAIN_SIZE, TRAIN_SIZE + TEST_SIZE, None)]

    # Apply the transform to each split separately.
    for idx in range(len(train["image"])):
        train["image"][idx] = data_transform(train["image"][idx])
    for idx in range(len(test["image"])):
        test["image"][idx] = data_transform(test["image"][idx])

    train = Dataset.from_dict(train)
    train = train.with_format("torch")  # avoid plain Python lists
    test = Dataset.from_dict(test)
    test = test.with_format("torch")    # avoid plain Python lists

    train_loader = DataLoader(train, batch_size=BATCH_SIZE, shuffle=True, drop_last=True)
    test_loader = DataLoader(test, batch_size=BATCH_SIZE, shuffle=True, drop_last=True)
    return train_loader, test_loader
```
## References and notes
* (Note) Honestly, the grayscale version could probably also be produced by just applying torchvision.transforms' grayscale conversion to the color version.
* The code used for downloading is <a href="https://colab.research.google.com/drive/1HmDorbxD5g6C2WDjLierUqbhecTdRvgA?usp=sharing">here</a>
* Reference: https://www.kaggle.com/code/osanseviero/musiccaps-explorer
* How it was built: download the wav files with the Kaggle reference code -> while creating the spectrograms, append JSON lines such as
```
{"filename":"spectrogram_*.png", "caption":"This is beautiful music"}
```
to metadata.jsonl, then upload the result
* If the Hugging Face data viewer stops working, it is also worth downloading the dataset once in Google Colab; surprisingly often it is just Hugging Face being buggy (true story (´;ω;`)) |
ItsMayur/Vehicle_Complaints_NHSTA | ---
license: mit
language:
- en
tags:
- legal
- vehicle data
- nlp
- t5
- text summarization
pretty_name: Vehicle Complaints for NLP
size_categories:
- n<1K
--- |
Doutran/datasetcd1mylla | ---
license: openrail
---
|
RichardErkhov/OneMillionFaces | ---
license: mit
task_categories:
- image-to-image
pretty_name: One million faces
size_categories:
- 1M<n<10M
---
# million-faces
Welcome to "million-faces", one of the largest facesets available to the public. Comprising a staggering one million faces, all images in this dataset are entirely AI-generated.
Due to the nature of AI-generated images, please be aware that some artifacts may be present in the dataset.
The dataset is fully hosted on Hugging Face, a renowned platform for hosting datasets and models for the machine learning community.
## Usage
Feel free to use this dataset for your projects and research. However, please do not hold me liable for any issues that might arise from its use. If you use this dataset and create something amazing, consider linking back to this GitHub project. Recognition of work is a pillar of the open-source community!
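A hedged loading sketch (the exact repository layout and field names are not documented on this card, so this assumes `datasets` can resolve it as a standard image dataset with a `train` split, and it streams to avoid pulling a million images up front):
```python
from datasets import load_dataset

# Stream so the full million-image set is not downloaded up front.
faces = load_dataset("RichardErkhov/OneMillionFaces", split="train", streaming=True)

first = next(iter(faces))
print(first.keys())  # inspect the real field names before relying on them
```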
## Dataset Details
- **Number of faces:** 1,000,000
- **Source:** AI-generated
- **Artifacts:** Some images may contain artifacts
- **Availability:** Fully uploaded on Hugging Face
## About
This project is about creating and sharing one of the largest AI-generated facesets. With one million faces, it offers a significant resource for researchers and developers in AI, machine learning, and computer vision. |
misitetong/boat-kg | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 624626
num_examples: 5538
download_size: 389912
dataset_size: 624626
---
# Dataset Card for "boat-kg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nguyenthanhdo/viettel_v3 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: translated
dtype: bool
- name: output_len
dtype: int64
- name: source
dtype: string
splits:
- name: train
num_bytes: 172800903.0
num_examples: 60000
download_size: 84019395
dataset_size: 172800903.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "viettel_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
noza-kit/wmt23_enjp_train_enpt_ex1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: en
dtype: string
- name: jp
dtype: string
splits:
- name: train
num_bytes: 9523975
num_examples: 41844
download_size: 4629898
dataset_size: 9523975
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_2 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 915302464
num_examples: 178352
download_size: 931494167
dataset_size: 915302464
---
# Dataset Card for "chunk_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sandeepv/sfinetunning | ---
license: apache-2.0
---
|
patrickvonplaten/librispeech_asr_self_contained | ---
pretty_name: LibriSpeech
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
paperswithcode_id: librispeech-1
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- audio-classification
task_ids:
- audio-speaker-identification
---
# Dataset Card for librispeech_asr
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [LibriSpeech ASR corpus](http://www.openslr.org/12)
- **Repository:** [Needs More Information]
- **Paper:** [LibriSpeech: An ASR Corpus Based On Public Domain Audio Books](https://www.danielpovey.com/files/2015_icassp_librispeech.pdf)
- **Leaderboard:** [Paperswithcode Leaderboard](https://paperswithcode.com/sota/speech-recognition-on-librispeech-test-other)
- **Point of Contact:** [Daniel Povey](mailto:dpovey@gmail.com)
### Dataset Summary
LibriSpeech is a corpus of approximately 1000 hours of 16kHz read English speech, prepared by Vassil Panayotov with the assistance of Daniel Povey. The data is derived from read audiobooks from the LibriVox project, and has been carefully segmented and aligned.
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`, `audio-speaker-identification`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active leaderboard which can be found at https://paperswithcode.com/sota/speech-recognition-on-librispeech-test-clean and ranks models based on their WER.
### Languages
The audio is in English. There are two configurations: `clean` and `other`.
The speakers in the corpus were ranked according to the WER of the transcripts of a model trained on
a different dataset, and were divided roughly in the middle,
with the lower-WER speakers designated as "clean" and the higher WER speakers designated as "other".
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file, usually called `file` and its transcription, called `text`. Some additional information about the speaker and the passage which contains the transcription is provided.
```
{'chapter_id': 141231,
'file': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'audio': {'path': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346,
0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000},
'id': '1272-141231-0000',
'speaker_id': 1272,
'text': 'A MAN SAID TO THE UNIVERSE SIR I EXIST'}
```
### Data Fields
- file: A path to the downloaded audio file in .flac format.
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`. A short sketch of this access pattern follows this list.
- text: the transcription of the audio file.
- id: unique id of the data sample.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
- chapter_id: id of the audiobook chapter which includes the transcription.
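A minimal sketch of the recommended access order (assuming this self-contained copy keeps the `clean` configuration and `validation` split described on this card):
```python
from datasets import load_dataset

ds = load_dataset("patrickvonplaten/librispeech_asr_self_contained", "clean", split="validation")

sample = ds[0]           # query the row index first ...
audio = sample["audio"]  # ... so only this one file is decoded and resampled
print(audio["sampling_rate"], audio["array"].shape, sample["text"])
```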
### Data Splits
The size of the corpus makes it impractical, or at least inconvenient
for some users, to distribute it as a single large archive. Thus the
training portion of the corpus is split into three subsets, with approximate size 100, 360 and 500 hours respectively.
A simple automatic
procedure was used to select the audio in the first two sets to be, on
average, of higher recording quality and with accents closer to US
English. An acoustic model was trained on WSJ’s si-84 data subset
and was used to recognize the audio in the corpus, using a bigram
LM estimated on the text of the respective books. We computed the
Word Error Rate (WER) of this automatic transcript relative to our
reference transcripts obtained from the book texts.
The speakers in the corpus were ranked according to the WER of
the WSJ model’s transcripts, and were divided roughly in the middle,
with the lower-WER speakers designated as "clean" and the higher-WER speakers designated as "other".
For "clean", the data is split into train, validation, and test set. The train set is further split into train.100 and train.360
respectively accounting for 100h and 360h of the training data.
For "other", the data is split into train, validation, and test set. The train set contains approximately 500h of recorded speech.
| | Train.500 | Train.360 | Train.100 | Valid | Test |
| ----- | ------ | ----- | ---- | ---- | ---- |
| clean | - | 104014 | 28539 | 2703 | 2620|
| other | 148688 | - | - | 2864 | 2939 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
The dataset was initially created by Vassil Panayotov, Guoguo Chen, Daniel Povey, and Sanjeev Khudanpur.
### Licensing Information
[CC BY 4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@inproceedings{panayotov2015librispeech,
title={Librispeech: an ASR corpus based on public domain audio books},
author={Panayotov, Vassil and Chen, Guoguo and Povey, Daniel and Khudanpur, Sanjeev},
booktitle={Acoustics, Speech and Signal Processing (ICASSP), 2015 IEEE International Conference on},
pages={5206--5210},
year={2015},
organization={IEEE}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset.
|
poorguys/TW-Kai_components_256 | ---
dataset_info:
features:
- name: id
dtype: int32
- name: images
dtype: image
- name: part
dtype: int32
- name: storke
dtype: int32
- name: usage_count
dtype: int32
splits:
- name: train
num_bytes: 896761.0
num_examples: 517
download_size: 848341
dataset_size: 896761.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hitochu/hate-speech-en | ---
license: wtfpl
---
```
{
"label": {
0: "normal",
1: "offensive",
2: "hateful",
3: "abusive",
4: "fearful",
5: "disrespectful",
99: "unknown"
},
"tweet": <string>
}
``` |
realfolkcode/open-music-practice-demo | ---
license: cc-by-4.0
---
|
GEM-submissions/lewtun__this-is-a-test-name__1655914374 | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name
|
Kishorereddy123/transformed_QA | ---
dataset_info:
features:
- name: Question_Answer
dtype: string
splits:
- name: train
num_bytes: 51479
num_examples: 86
download_size: 27643
dataset_size: 51479
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_205 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1149568000
num_examples: 224000
download_size: 1174781399
dataset_size: 1149568000
---
# Dataset Card for "chunk_205"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/snips_artificial_5pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1112869
num_examples: 13084
download_size: 402550
dataset_size: 1112869
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Coldog2333/tiage | ---
license: mit
language:
- en
tags:
- dialogue segmentation
size_categories:
- n<1K
---
# Dataset Card for TIAGE
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/HuiyuanXie/tiage](https://github.com/HuiyuanXie/tiage)
- **Repository:** [https://github.com/HuiyuanXie/tiage](https://github.com/HuiyuanXie/tiage)
- **Paper:** TIAGE: A Benchmark for Topic-Shift Aware Dialog Modeling
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages: English
## Dataset Structure
### Data Instances
```
{
"dial_data": {
"tiage": [
{
"dial_id": "tiage_dial_001",
"turns": [
{
"da": "",
"role": "user",
"turn_id": 0,
"utterance": "hello , how are you doing tonight ?",
"topic_id": 0,
"segmentation_label": 0
},
...
{
"da": "",
"role": "user",
"turn_id": 15,
"utterance": "i bet it is oh i could not",
"topic_id": 4,
"segmentation_label": 1
}
],
...
}
]
}
}
```
### Data Fields
#### Dialogue-Level
+ `dial_id`: ID of a dialogue;
+ `turns`: All utterances of a dialogue.
#### Utterance-Level
+ `da`: Dialogue Act annotation (here is Null);
+ `role`: Role annotation (here is user/agent/user/agent... in default);
+ `turn_id`: ID of an utterance;
+ `utterance`: Text of the utterance;
+ `topic_id`: ID (order) of the current topic;
+ `segmentation_label`: 1: the utterance is the end of a topic segment; 0: otherwise (see the sketch after this list).
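To make the segmentation labels concrete, here is a minimal sketch (my own helper, not part of the released tooling) that groups one dialogue's turns into topic segments:
```python
def split_into_segments(dialogue):
    """Group a dialogue's turns into topic segments using `segmentation_label`."""
    segments, current = [], []
    for turn in dialogue["turns"]:
        current.append(turn["utterance"])
        if turn["segmentation_label"] == 1:  # this utterance ends a topic
            segments.append(current)
            current = []
    if current:  # keep trailing turns that lack a closing label
        segments.append(current)
    return segments
```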
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
MIT
### Citation Information
```
@article{xie2021tiage,
title={TIAGE: A Benchmark for Topic-Shift Aware Dialog Modeling},
author={Xie, Huiyuan and Liu, Zhenghao and Xiong, Chenyan and Liu, Zhiyuan and Copestake, Ann},
journal={arXiv preprint arXiv:2109.04562},
year={2021}
}
```
### Contributions
+ Thanks to [@HuiyuanXie](https://github.com/HuiyuanXie/) for collecting this dataset.
+ Thanks to [@Coldog2333](https://github.com/Coldog2333) for adding this dataset. |
Lit4pCol4b/primer_demo_ejemplo_ds_semantic |
---
task_categories:
- image-segmentation
---
# Dataset Card for primer_demo_ejemplo_ds_semantic
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset description](#dataset-description)
- [Dataset categories](#dataset-categories)
## Dataset description
- **Homepage:** https://huggingface.co/datasets/Lit4pCol4b/primer_demo_ejemplo_ds_semantic
## Dataset categories
| Id | Name | Description |
| --- | ---- | ----------- |
| 1 | name_1 | - |
| 2 | name_2 | - |
| 3 | name_3 | - |
|
Robette/Murgia | ---
license: openrail
---
|
Anusha64/NedData | ---
license: apache-2.0
---
|
ticoAg/zhihu_3k_rlhf_train | ---
license: apache-2.0
task_categories:
- question-answering
language:
- zh
size_categories:
- 1K<n<10K
---
# Note
> Some reward-model (RM) data drawn from public datasets.
- format
```json
{
"history": [
"query1", "answer1",
"query2", "answer2"
],
"prompt": "query",
"input": "input for query",
"output": [
"output rank1",
"output rank2",
"output rank3"
]
}
```
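As an illustrative sketch (my own helper, not part of any released tooling, and it assumes earlier entries in `output` are the higher-ranked ones), records in this format can be flattened into pairwise preference examples:
```python
def to_preference_pairs(record):
    """Turn one record into (prompt, chosen, rejected) pairs; earlier outputs rank higher."""
    prompt = "\n".join(record.get("history", []) + [record["prompt"], record.get("input", "")]).strip()
    outputs = record["output"]
    pairs = []
    for i in range(len(outputs)):
        for j in range(i + 1, len(outputs)):
            pairs.append({"prompt": prompt, "chosen": outputs[i], "rejected": outputs[j]})
    return pairs

example = {
    "history": ["query1", "answer1"],
    "prompt": "query",
    "input": "",
    "output": ["output rank1", "output rank2", "output rank3"],
}
print(len(to_preference_pairs(example)))  # 3 pairs from 3 ranked outputs
```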
## Thanks
- [beyond/rlhf-reward-single-round-trans_chinese](https://huggingface.co/datasets/beyond/rlhf-reward-single-round-trans_chinese)
- [dikw/hh_rlhf_cn](https://huggingface.co/datasets/dikw/hh_rlhf_cn)
- [liyucheng/zhihu_rlhf_3k](https://huggingface.co/datasets/liyucheng/zhihu_rlhf_3k) |
open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-4 | ---
pretty_name: Evaluation run of Josephgflowers/Tinyllama-1.5B-Cinder-Test-4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/Tinyllama-1.5B-Cinder-Test-4](https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T17:24:27.759280](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-4/blob/main/results_2024-04-06T17-24-27.759280.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27217592186527517,\n\
\ \"acc_stderr\": 0.03144276024255498,\n \"acc_norm\": 0.2734738493888578,\n\
\ \"acc_norm_stderr\": 0.032278907489773266,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.41921478190992034,\n\
\ \"mc2_stderr\": 0.015367123176466086\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.31143344709897613,\n \"acc_stderr\": 0.013532472099850947,\n\
\ \"acc_norm\": 0.3242320819112628,\n \"acc_norm_stderr\": 0.01367881039951882\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4043019318860785,\n\
\ \"acc_stderr\": 0.004897534686686321,\n \"acc_norm\": 0.5204142601075483,\n\
\ \"acc_norm_stderr\": 0.0049856207736834295\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338004,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2903225806451613,\n\
\ \"acc_stderr\": 0.02582210611941589,\n \"acc_norm\": 0.2903225806451613,\n\
\ \"acc_norm_stderr\": 0.02582210611941589\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3212121212121212,\n \"acc_stderr\": 0.03646204963253812,\n\
\ \"acc_norm\": 0.3212121212121212,\n \"acc_norm_stderr\": 0.03646204963253812\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752947,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128002,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128002\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380572,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380572\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329881,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329881\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.19117647058823528,\n \"acc_stderr\": 0.02759917430064077,\n \"\
acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02759917430064077\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.29596412556053814,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082395,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082395\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891176,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891176\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\
\ \"acc_stderr\": 0.016246087069701404,\n \"acc_norm\": 0.29118773946360155,\n\
\ \"acc_norm_stderr\": 0.016246087069701404\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399202,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399202\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017747,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017747\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3202614379084967,\n \"acc_stderr\": 0.026716118380156844,\n\
\ \"acc_norm\": 0.3202614379084967,\n \"acc_norm_stderr\": 0.026716118380156844\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33440514469453375,\n\
\ \"acc_stderr\": 0.026795422327893947,\n \"acc_norm\": 0.33440514469453375,\n\
\ \"acc_norm_stderr\": 0.026795422327893947\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827061,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827061\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.022770868010113035,\n\
\ \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.022770868010113035\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.19090909090909092,\n \"acc_stderr\": 0.03764425585984926,\n\
\ \"acc_norm\": 0.19090909090909092,\n \"acc_norm_stderr\": 0.03764425585984926\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17551020408163265,\n\
\ \"acc_stderr\": 0.024352800722970015,\n \"acc_norm\": 0.17551020408163265,\n\
\ \"acc_norm_stderr\": 0.024352800722970015\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.030769444967296018,\n\
\ \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.030769444967296018\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322415,\n\
\ \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322415\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n\
\ \"acc_stderr\": 0.03488647713457922,\n \"acc_norm\": 0.29239766081871343,\n\
\ \"acc_norm_stderr\": 0.03488647713457922\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n\
\ \"mc2\": 0.41921478190992034,\n \"mc2_stderr\": 0.015367123176466086\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5966850828729282,\n\
\ \"acc_stderr\": 0.013787257285896245\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|arc:challenge|25_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|gsm8k|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hellaswag|10_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T17-24-27.759280.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T17-24-27.759280.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- '**/details_harness|winogrande|5_2024-04-06T17-24-27.759280.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T17-24-27.759280.parquet'
- config_name: results
data_files:
- split: 2024_04_06T17_24_27.759280
path:
- results_2024-04-06T17-24-27.759280.parquet
- split: latest
path:
- results_2024-04-06T17-24-27.759280.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.5B-Cinder-Test-4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.5B-Cinder-Test-4](https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-4",
"harness_winogrande_5",
split="train")
```
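Similarly, the aggregated metrics of the run can be loaded from the `results` configuration (a minimal sketch; the exact column layout of the parquet file can vary between harness versions):
```python
from datasets import load_dataset

# The "latest" split of the "results" config points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-4",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated metrics of the run
```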
## Latest results
These are the [latest results from run 2024-04-06T17:24:27.759280](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-4/blob/main/results_2024-04-06T17-24-27.759280.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.27217592186527517,
"acc_stderr": 0.03144276024255498,
"acc_norm": 0.2734738493888578,
"acc_norm_stderr": 0.032278907489773266,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.41921478190992034,
"mc2_stderr": 0.015367123176466086
},
"harness|arc:challenge|25": {
"acc": 0.31143344709897613,
"acc_stderr": 0.013532472099850947,
"acc_norm": 0.3242320819112628,
"acc_norm_stderr": 0.01367881039951882
},
"harness|hellaswag|10": {
"acc": 0.4043019318860785,
"acc_stderr": 0.004897534686686321,
"acc_norm": 0.5204142601075483,
"acc_norm_stderr": 0.0049856207736834295
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.03047297336338004,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.03047297336338004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2903225806451613,
"acc_stderr": 0.02582210611941589,
"acc_norm": 0.2903225806451613,
"acc_norm_stderr": 0.02582210611941589
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3212121212121212,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.3212121212121212,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752947,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128002,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128002
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380572,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380572
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329881,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329881
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082395,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082395
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891176,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891176
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29118773946360155,
"acc_stderr": 0.016246087069701404,
"acc_norm": 0.29118773946360155,
"acc_norm_stderr": 0.016246087069701404
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399202,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399202
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017747,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017747
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3202614379084967,
"acc_stderr": 0.026716118380156844,
"acc_norm": 0.3202614379084967,
"acc_norm_stderr": 0.026716118380156844
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.33440514469453375,
"acc_stderr": 0.026795422327893947,
"acc_norm": 0.33440514469453375,
"acc_norm_stderr": 0.026795422327893947
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827061,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827061
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16911764705882354,
"acc_stderr": 0.022770868010113035,
"acc_norm": 0.16911764705882354,
"acc_norm_stderr": 0.022770868010113035
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17551020408163265,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.17551020408163265,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.41921478190992034,
"mc2_stderr": 0.015367123176466086
},
"harness|winogrande|5": {
"acc": 0.5966850828729282,
"acc_stderr": 0.013787257285896245
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
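For quick programmatic access, the aggregate block above can be read straight from the downloaded JSON file (a minimal sketch; the file is the one linked above, and the key layout fallback is an assumption that may differ between harness versions):
```python
import json

# Path as downloaded from the repo (see the link above).
with open("results_2024-04-06T17-24-27.759280.json") as f:
    data = json.load(f)

# Some harness versions nest the metrics under a "results" key; fall back
# to the top level otherwise (this fallback is an assumption, not guaranteed).
metrics = data.get("results", data)
print(metrics["all"]["acc"])        # 0.27217592186527517
print(metrics["harness|gsm8k|5"])   # {'acc': 0.0, 'acc_stderr': 0.0}
```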
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Marchanjo/spider-FIT-en-pt-es-fr-extra-3enr-3ptr-3esr-3frr | ---
license: cc-by-sa-4.0
---
Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links to the model checkpoints and datasets are on GitHub: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model checkpoints and datasets; for a full understanding, it is better to go to the GitHub repository [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of tables and columns names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy results from 0.718 to 0.736 in a validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
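As an illustration of the schema-pruning idea (a toy sketch, not the matching procedure used in the paper), pruning can be thought of as keeping only the tables and columns that appear relevant to the question before concatenating question and schema into the model input:
```python
def prune_schema(question: str, schema: dict[str, list[str]]) -> dict[str, list[str]]:
    """Toy schema pruning: keep tables/columns that surface in the question.

    `schema` maps table names to column names. This overlap heuristic is an
    illustrative assumption, not the paper's actual matching procedure.
    """
    q_tokens = set(question.lower().split())
    pruned = {}
    for table, columns in schema.items():
        kept = [c for c in columns if c.lower() in q_tokens]
        if table.lower() in q_tokens or kept:
            # Keep the matched columns; keep all columns if only the table matched.
            pruned[table] = kept or columns
    return pruned


schema = {"singer": ["name", "age", "country"], "concert": ["year", "stadium_id"]}
print(prune_schema("What is the average age of singers?", schema))
# {'singer': ['age']}
```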
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here the SharedIt link](https://rdcu.be/dff19), and [here the pre-print in arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) |
DeepSpiral/Reasearch-Paper-Test_382 | ---
license: gpl-3.0
---
|
Hello-SimpleAI/HC3-Chinese | ---
task_categories:
- text-classification
- question-answering
- sentence-similarity
- zero-shot-classification
language:
- en
- zh
tags:
- ChatGPT
- SimpleAI
- Detection
- OOD
size_categories:
- 10K<n<100K
license: cc-by-sa-4.0
---
# Human ChatGPT Comparison Corpus (HC3)
We propose the first human-ChatGPT comparison corpus, named the **HC3** dataset.
This dataset is introduced in our paper:
- Paper: [***How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection***](https://arxiv.org/abs/2301.07597)
Code, models and analysis are available on our GitHub:
- GitHub: [**Chatgpt-Comparison-Detection project** 🔬](https://github.com/Hello-SimpleAI/chatgpt-comparison-detection)
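As a sketch of how the corpus can be loaded (the `"all"` configuration name and the field names are assumptions here; check the dataset viewer for the actual ones):

```python
# Sketch: load the Chinese portion of HC3. The "all" config name and the
# record fields are assumptions -- verify them in the dataset viewer.
from datasets import load_dataset

hc3_zh = load_dataset("Hello-SimpleAI/HC3-Chinese", "all", split="train")
print(hc3_zh[0].keys())  # expected to pair human and ChatGPT answers per question
```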
# Dataset Copyright
If a source dataset used in this corpus has a specific license which is stricter than CC-BY-SA, our products follow the same. If not, they follow the CC-BY-SA license.
See [dataset copyright](https://github.com/Hello-SimpleAI/chatgpt-comparison-detection#dataset-copyright).
# Citation
Check out this paper: [arXiv: 2301.07597](https://arxiv.org/abs/2301.07597)
```
@article{guo-etal-2023-hc3,
title = "How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection",
author = "Guo, Biyang and
Zhang, Xin and
Wang, Ziyuan and
Jiang, Minqi and
Nie, Jinran and
Ding, Yuxuan and
Yue, Jianwei and
Wu, Yupeng",
journal = "arXiv preprint arXiv:2301.07597",
year = "2023",
}
``` |
Vinnyyw/Dulceemaria | ---
license: openrail
---
|
hanho/test1 | ---
license: openrail
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9453
dataset_size: 2464
---
|
CJWeiss/multitiny_rename | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input
sequence: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 784230112
num_examples: 1202
- name: test
num_bytes: 118646528
num_examples: 240
- name: valid
num_bytes: 116992208
num_examples: 161
download_size: 460324193
dataset_size: 1019868848
---
# Dataset Card for "multitiny_rename"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Carlosgg14/Ace | ---
license: openrail
---
|
yuiseki/onomatopoeia-ja-flat | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_lang
dtype: string
- name: onomatopoeia_ja
dtype: string
splits:
- name: train
num_bytes: 646336
num_examples: 10346
download_size: 229643
dataset_size: 646336
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "onomatopoeia-ja-flat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hongji-s/as04-pairrm | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: winner
dtype: string
- name: loser
dtype: string
splits:
- name: train
num_bytes: 28191.30769230769
num_examples: 11
- name: test
num_bytes: 5125.692307692308
num_examples: 2
download_size: 59908
dataset_size: 33317.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored | ---
pretty_name: Evaluation run of Fredithefish/Guanaco-3B-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/Guanaco-3B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T08:31:46.700164](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored/blob/main/results_2023-10-17T08-31-46.700164.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004718959731543624,\n\
\ \"em_stderr\": 0.0007018360183131023,\n \"f1\": 0.0561975671140941,\n\
\ \"f1_stderr\": 0.001432842181402735,\n \"acc\": 0.3195438174264424,\n\
\ \"acc_stderr\": 0.007770725048768472\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004718959731543624,\n \"em_stderr\": 0.0007018360183131023,\n\
\ \"f1\": 0.0561975671140941,\n \"f1_stderr\": 0.001432842181402735\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480444\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6337805840568271,\n \"acc_stderr\": 0.0135401443765889\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|arc:challenge|25_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T08_31_46.700164
path:
- '**/details_harness|drop|3_2023-10-17T08-31-46.700164.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T08-31-46.700164.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T08_31_46.700164
path:
- '**/details_harness|gsm8k|5_2023-10-17T08-31-46.700164.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T08-31-46.700164.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hellaswag|10_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T07_19_31.389190
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T07:19:31.389190.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T07:19:31.389190.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T08_31_46.700164
path:
- '**/details_harness|winogrande|5_2023-10-17T08-31-46.700164.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T08-31-46.700164.parquet'
- config_name: results
data_files:
- split: 2023_10_17T08_31_46.700164
path:
- results_2023-10-17T08-31-46.700164.parquet
- split: latest
path:
- results_2023-10-17T08-31-46.700164.parquet
---
# Dataset Card for Evaluation run of Fredithefish/Guanaco-3B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/Guanaco-3B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored",
"harness_winogrande_5",
split="train")
```
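The aggregated metrics shown below live in the "results" configuration, whose most recent run is exposed as a `latest` split (see the configuration list above); as a sketch, they can be loaded the same way:

```python
from datasets import load_dataset

# Sketch: load the aggregated metrics of the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored",
    "results",
    split="latest",
)
```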
## Latest results
These are the [latest results from run 2023-10-17T08:31:46.700164](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored/blob/main/results_2023-10-17T08-31-46.700164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131023,
"f1": 0.0561975671140941,
"f1_stderr": 0.001432842181402735,
"acc": 0.3195438174264424,
"acc_stderr": 0.007770725048768472
},
"harness|drop|3": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131023,
"f1": 0.0561975671140941,
"f1_stderr": 0.001432842181402735
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480444
},
"harness|winogrande|5": {
"acc": 0.6337805840568271,
"acc_stderr": 0.0135401443765889
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-6fbfec76-7855038 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: santiviquez/t5-small-finetuned-samsum-en
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: santiviquez/t5-small-finetuned-samsum-en
* Dataset: samsum
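As a sketch, the prediction files stored in this repo can be listed like any other dataset repository (the internal layout written by AutoTrain is an assumption; inspect the files before relying on them):

```python
# Sketch: list the raw prediction files written by AutoTrain to this repo.
from huggingface_hub import list_repo_files

files = list_repo_files(
    "autoevaluate/autoeval-staging-eval-project-6fbfec76-7855038",
    repo_type="dataset",
)
print(files)
```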
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
open-llm-leaderboard/details_quantumaikr__quantum-v0.01 | ---
pretty_name: Evaluation run of quantumaikr/quantum-v0.01
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/quantum-v0.01](https://huggingface.co/quantumaikr/quantum-v0.01)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__quantum-v0.01\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T15:38:18.408039](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantum-v0.01/blob/main/results_2024-01-10T15-38-18.408039.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6572995040063592,\n\
\ \"acc_stderr\": 0.03199963347273244,\n \"acc_norm\": 0.6571345413469432,\n\
\ \"acc_norm_stderr\": 0.032660707489366475,\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.6927526472916785,\n\
\ \"mc2_stderr\": 0.015028880570718646\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6936860068259386,\n \"acc_stderr\": 0.013470584417276513,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7102170882294364,\n\
\ \"acc_stderr\": 0.004527343651130798,\n \"acc_norm\": 0.882692690699064,\n\
\ \"acc_norm_stderr\": 0.0032112847607016636\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741622,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n\
\ \"acc_stderr\": 0.016706617522176132,\n \"acc_norm\": 0.4782122905027933,\n\
\ \"acc_norm_stderr\": 0.016706617522176132\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.6927526472916785,\n\
\ \"mc2_stderr\": 0.015028880570718646\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498428\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624179\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/quantum-v0.01
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-38-18.408039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-38-18.408039.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- '**/details_harness|winogrande|5_2024-01-10T15-38-18.408039.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T15-38-18.408039.parquet'
- config_name: results
data_files:
- split: 2024_01_10T15_38_18.408039
path:
- results_2024-01-10T15-38-18.408039.parquet
- split: latest
path:
- results_2024-01-10T15-38-18.408039.parquet
---
# Dataset Card for Evaluation run of quantumaikr/quantum-v0.01
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [quantumaikr/quantum-v0.01](https://huggingface.co/quantumaikr/quantum-v0.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__quantum-v0.01",
"harness_winogrande_5",
	split="latest")
```
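To load the aggregated results of the run instead, the "results" configuration can be queried the same way (a minimal sketch, assuming the configuration and split layout declared in the header above):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_quantumaikr__quantum-v0.01",
	"results",
	split="latest")
```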
## Latest results
These are the [latest results from run 2024-01-10T15:38:18.408039](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantum-v0.01/blob/main/results_2024-01-10T15-38-18.408039.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6572995040063592,
"acc_stderr": 0.03199963347273244,
"acc_norm": 0.6571345413469432,
"acc_norm_stderr": 0.032660707489366475,
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.6927526472916785,
"mc2_stderr": 0.015028880570718646
},
"harness|arc:challenge|25": {
"acc": 0.6936860068259386,
"acc_stderr": 0.013470584417276513,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7102170882294364,
"acc_stderr": 0.004527343651130798,
"acc_norm": 0.882692690699064,
"acc_norm_stderr": 0.0032112847607016636
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.016706617522176132,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.016706617522176132
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.6927526472916785,
"mc2_stderr": 0.015028880570718646
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498428
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624179
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Astirstroh/Sinatra | ---
license: unlicense
---
|
yanickschraner/swiss_parliament_corpus | ---
dataset_info:
features:
- name: client_id
dtype: int64
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: float64
- name: down_votes
dtype: float64
- name: age
dtype: float64
- name: gender
dtype: float64
- name: accent
dtype: float64
- name: iou_estimate
dtype: float64
splits:
- name: train
num_bytes: 24373100536.732
num_examples: 90324
- name: test
num_bytes: 824083440.94
num_examples: 3332
download_size: 14083003405
dataset_size: 25197183977.671997
---
# Dataset Card for "swiss_parliament_corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/chung-khoan-demo | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 47433728
num_examples: 9257
download_size: 16956338
dataset_size: 47433728
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
satyaalmasian/narrativeqa_subset | ---
dataset_info:
features:
- name: document
struct:
- name: id
dtype: string
- name: kind
dtype: string
- name: url
dtype: string
- name: file_size
dtype: int32
- name: word_count
dtype: int32
- name: start
dtype: string
- name: end
dtype: string
- name: summary
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: answers
list:
- name: text
dtype: string
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 110879812
num_examples: 317
download_size: 2272692
dataset_size: 110879812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JJini/honsol | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6682255
num_examples: 12880
download_size: 1056734
dataset_size: 6682255
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/charybdis_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of charybdis/カリブディス/卡律布狄斯 (Azur Lane)
This is the dataset of charybdis/カリブディス/卡律布狄斯 (Azur Lane), containing 110 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, grey_hair, grey_eyes, bangs, maid_headdress, huge_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 110 | 193.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charybdis_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 110 | 92.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charybdis_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 265 | 200.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charybdis_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 110 | 163.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charybdis_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 265 | 310.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charybdis_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/charybdis_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
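The fixed-size IMG+TXT packages from the table above can be fetched the same way (a minimal sketch, assuming `dataset-800.zip` and the IMG+TXT convention of one same-named `.txt` tag file per image):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the fixed-size archives listed in the package table
zip_file = hf_hub_download(
    repo_id='CyberHarem/charybdis_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumption: each image ships with a same-named .txt file holding its tags
for name in sorted(os.listdir(dataset_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(dataset_dir, name), encoding='utf-8') as f:
            print(name, f.read().strip())
```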
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, cleavage, solo, looking_at_viewer, official_alternate_costume, dress, thighhighs, sitting, blush, feather_boa, mimikaki, thighs, covered_navel, smile, holding, black_hair |
| 1 | 40 |  |  |  |  |  | 1girl, looking_at_viewer, white_gloves, solo, underboob_cutout, maid, white_apron, elbow_gloves, simple_background, black_dress, frills, sleeveless, blush, white_background, waist_apron, bare_shoulders |
| 2 | 13 |  |  |  |  |  | 1girl, blush, solo, completely_nude, open_mouth, nipples, black_hair, head_out_of_frame, heart, navel, sweat, very_long_hair, night, thighs, wet |
| 3 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, sun_hat, bare_shoulders, cleavage, outdoors, hat_flower, thighs, white_headwear, blue_sky, brown_eyes, closed_mouth, collarbone, day, hair_between_eyes, parted_lips, sitting, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | solo | looking_at_viewer | official_alternate_costume | dress | thighhighs | sitting | blush | feather_boa | mimikaki | thighs | covered_navel | smile | holding | black_hair | white_gloves | underboob_cutout | maid | white_apron | elbow_gloves | simple_background | black_dress | frills | sleeveless | white_background | waist_apron | bare_shoulders | completely_nude | open_mouth | nipples | head_out_of_frame | heart | navel | sweat | very_long_hair | night | wet | sun_hat | outdoors | hat_flower | white_headwear | blue_sky | brown_eyes | closed_mouth | collarbone | day | hair_between_eyes | parted_lips | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:--------------------|:-----------------------------|:--------|:-------------|:----------|:--------|:--------------|:-----------|:---------|:----------------|:--------|:----------|:-------------|:---------------|:-------------------|:-------|:--------------|:---------------|:--------------------|:--------------|:---------|:-------------|:-------------------|:--------------|:-----------------|:------------------|:-------------|:----------|:--------------------|:--------|:--------|:--------|:-----------------|:--------|:------|:----------|:-----------|:-------------|:-----------------|:-----------|:-------------|:---------------|:-------------|:------|:--------------------|:--------------|:--------------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 40 |  |  |  |  |  | X | | X | X | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | | X | | | | | | X | | | X | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | | | | X | | | | X | | X | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_3_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 1037
num_examples: 32
download_size: 2262
dataset_size: 1037
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_3_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benchang1110/multiturn_chat_0.8m-chinese-zhtw | ---
dataset_info:
features:
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 969083052
num_examples: 831036
download_size: 561072214
dataset_size: 969083052
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "multiturn_chat_0.8m-chinese-zhtw"
## Contents
Contains approx. 0.8M multi-turn dialogues between *user* and *assistant* generated by the [BELLE](https://github.com/LianjiaTech/BELLE) project.
Note: this dataset was generated by ChatGPT and has not been strictly verified; the content may contain errors. Please keep this in mind when using it.
## Limitations and Usage Restrictions
We ask developers to use our open-sourced code, data, models, and any derivatives for research purposes only; they must not be used commercially or for any other purpose that could harm society.
Since the data was generated by *ChatGPT* and has not been strictly verified, it still has shortcomings in factual accuracy and other respects. Please screen the data carefully when using this dataset.
This dataset does not represent the position, interests, or views of any party, and is unrelated to claims of any kind by any group. The developers of this project assume no responsibility for any damage or dispute arising from the use of this dataset.
***
# Multiturn Chat 0.8M
## Contents
Includes approx. 0.8M Chinese multiturn dialogs between *human* and *assistant*.
Note: this subset was generated by *ChatGPT* and was not strictly verified. The dialog contents might contain errors. Please keep this in mind when using this subset.
**instruction** contains the dialog history, with turns marked by *Human:* and *Assistant:*; **output** contains the current reply by the *assistant*.
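A minimal sketch (not part of the original card) of how the `conversations` feature in this repo can be consumed; the field names are taken from the YAML header above:
```python
from datasets import load_dataset

# Load the Traditional Chinese multi-turn chat dataset (train split only).
ds = load_dataset("benchang1110/multiturn_chat_0.8m-chinese-zhtw", split="train")

# Each example holds a list of turns; every turn is a dict with "role" and "content".
for turn in ds[0]["conversations"]:
    print(f"{turn['role']}: {turn['content']}")
```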
|
AlanYky/big-bench-list-function-turing | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 200221
num_examples: 835
download_size: 69165
dataset_size: 200221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
celloscopeai/bangla_ner_dataset | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 2107132
num_examples: 5252
- name: validation
num_bytes: 522332
num_examples: 1314
download_size: 569072
dataset_size: 2629464
---
# Dataset Card for "bangla_ner_dataset"
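A minimal sketch, based only on the YAML schema above, showing how the integer `ner_tags` can be decoded back into their BIO label names:
```python
from datasets import load_dataset

ds = load_dataset("celloscopeai/bangla_ner_dataset", split="train")

# The Sequence(ClassLabel) feature carries the label names declared in the YAML.
label_names = ds.features["ner_tags"].feature.names  # ['O', 'B-PER', ..., 'I-LOC']

example = ds[0]
for token, tag in zip(example["tokens"], example["ner_tags"]):
    print(token, label_names[tag])
```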
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ruanchaves/hatebr_por_Latn_to_spa_Latn | ---
dataset_info:
features:
- name: instagram_comments
dtype: string
- name: offensive_language
dtype: bool
- name: offensiveness_levels
dtype: int32
- name: antisemitism
dtype: bool
- name: apology_for_the_dictatorship
dtype: bool
- name: fatphobia
dtype: bool
- name: homophobia
dtype: bool
- name: partyism
dtype: bool
- name: racism
dtype: bool
- name: religious_intolerance
dtype: bool
- name: sexism
dtype: bool
- name: xenophobia
dtype: bool
- name: offensive_&_non-hate_speech
dtype: bool
- name: non-offensive
dtype: bool
- name: specialist_1_hate_speech
dtype: bool
- name: specialist_2_hate_speech
dtype: bool
- name: specialist_3_hate_speech
dtype: bool
splits:
- name: train
num_bytes: 426153
num_examples: 4480
- name: validation
num_bytes: 94951
num_examples: 1120
- name: test
num_bytes: 120538
num_examples: 1400
download_size: 0
dataset_size: 641642
---
# Dataset Card for "hatebr_por_Latn_to_spa_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sarpba/OAAST1_hu_madlad_3b | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: validation
num_bytes: 3989719
num_examples: 3246
- name: train
num_bytes: 96522107
num_examples: 80877
download_size: 36444526
dataset_size: 100511826
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
---
|
yadheedhya/autotrain-data-wiki-sum | ---
language:
- en
task_categories:
- summarization
---
# AutoTrain Dataset for project: wiki-sum
## Dataset Description
This dataset has been automatically processed by AutoTrain for project wiki-sum.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "(CNN) -- Shelling hit areas near two key cities in eastern Ukraine on Sunday morning, intensifying fears that a ceasefire that took effect less than two days ago may be falling apart. Why is the ceasefire under strain? A variety of fighting factions in the conflict zone -- on both sides -- may not fall directly under a military chain of command. The pro-Russian rebels are mostly volunteer militias; fighting against them on the Ukrainian side are at least some far-right nationalist militias. Controlling these groups is difficult and some may have different aims, including sabotaging the truce. At this point it's been nearly impossible to figure out who's doing the firing and why. The conditions of the ceasefire agreement don't help either. The conditions are vague and at this point there doesn't seem to be an effective mechanism in place inside the conflict zone to monitor and enforce the agreement. Why can't the two sides' leaders control their forces? It's unclear if Kiev has control over all of the fighting forces in eastern Ukraine. Some of the volunteer militias fighting alongside Ukrainian soldiers are far-right nationalists who've been critical of the current government in Kiev, but they're still fighting because they feel Ukraine is under attack by Russia. And who controls the pro-Russian rebels? Is it the local commanders? Is it Russian President Vladimir Putin? None of that is clear. Which side has the most to gain from the truce? If the truce leads to good-faith negotiations and a compromise, then both sides can gain. A compromise could look like something like this: The pro-Russian region of Donbas gets autonomy and self-determination under a federalized Ukrainian government, and in return the rebels drop their demand for independence and Kiev gets to protect Ukraine's sovereignty and territorial integrity. There are elements on both sides that don't want a compromise, and they could certainly have the potential of undermining the truce. What happens next? We wait to see if the overnight shellings and firings are an anomaly or if they're a sign of more violence and more fighting. If the ceasefire sticks, both sides have agreed to hold talks that will address the core issues and demands on both sides that are still unresolved -- including the disarming of the rebels, a guarantee of self-determination for the pro-Russian Donbas region, the fate of Russian-annexed Crimea, constitutional reform, and a solution to the humanitarian crisis in the conflict zone. Obviously if the fighting continues, all bets are off. What effect does this have on the rest of the world? What happens in the coming days will determine the next move by NATO and Western leaders. If the ceasefire falls apart, the West will likely turn up the pressure by following through with sanctions and bolstering Western forces in NATO's Baltic-member states. Moscow has already threatened to respond if that happens. The bottom line is, the conflict will escalate and so will the prospects of a regional conflict -- although at this point that seems unlikely.",
"target": "Shelling in eastern Ukraine raises fears that a ceasefire may fall apart .\nReza Sayah: It's been nearly impossible to say who's doing firing and why .\nIt's unclear who has control over fighting forces on either side in east Ukraine .\nThere are elements on both sides that don't want a compromise .",
"feat_id": "6d779ca2ed0bf4ce6f08bbd5e5d7223f854cc44d"
},
{
"text": "Killer instincts? Lacey Spears, 26, is pictured with her 5-year-old son Garnett (right) who she's now accused of killing . A New York State mom blogger who gained a widespread following with posts about her son's near-constant health problems could soon be charged in the 5-year-old's suspicious January death. Authorities in Westchester County almost immediately looked to Lacey Spears when her son Garnett died and have since determined the boy died from acute sodium poisoning. Now, a source close to the investigation has revealed that the 26-year-old southern transplant could soon find herself in a New York jail as authorities prepare to press charges in what they've deemed a homicide. 'It's evident by the nature of what was found in his body that somebody, in effect, poisoned him,' the source told FoxNews.com. 'He died at the hands of somebody else.' Should Spears be charged, it will be the culmination of months of speculation over her involvement in the death, the state of her mental health and of her possible motivations. Spears could potentially get charged with anything from murder to negligent homicide or manslaughter. Since her son's death, Spears has fled New York, where the Alabama native lived in a 'fellowship community' of back-to-the-land types, for Kentucky. Her attorney David Sachs denied his client had anything to do with her son's death. 'Lacey is completely devastated by the loss of her son and absolutely denies harming her son in any way,' Sachs told FoxNews.com . Agonizing: The boy died from acute sodium intoxication days after he was rushed to a Westchester County, New York hospital complaining of severe stomach pains . Garnett Spears died at Westchester . Medical Center in January after Spears rushed him to the emergency room . on suffering intense abdominal pains from what appeared to be a stomach . virus. The little boy's condition worsened and he died four days later with what appeared to be dangerously high amounts of sodium in his system - leading investigators to question whether Garnett was deliberately given the salt. Indeed, before he tragically passed away, hospital officials told police that the levels were suspiciously high - which caused the investigation to be opened into the Chestnut Ridge mother. Following this, police sought and obtained a search warrant for her home in mid-January. Suspicious: A neighbor claims Spears asked that he hide one of the many feeding tube bags that she used--unnecessarily many say--to feed Garnett as he lay dying in January . They took food from her home, her cell phone and computer. They have also questioned Lacey Spears' friends and family and obtained Garnett's past medical records after discovering that he had been hospitalized 23-times during his short life. As the investigation continued, a neighbor who asked not to be named told USA Today that as Spears' son lay dying she asked her to go to her home and dispose of one of the boy's feeding bags - which allegedly contained a large amount of sodium. Police have not released why Garnett . had a feeding tube - believed to be in his abdomen - but have pointed to . his history of illness and repeated hospital visits. The . neighbor who asked not to be named said that they did initially remove . the bag - but then phoned the police and turned it over when they heard . about the circumstances surrounding the boys death. Those circumstances, it became clear all too late, were nothing new. The . same questionable parenting and suggestions of the child-harming mental . 
illness Munchhausen by proxy surrounded Garnett and his mom starting . from just after the boy was born and across the states of Alabama, . Florida and New York. The . Florida Department of Children and Families revealed in April that . they've had a file on Spears since 2011, when an anonymous call voicing . concerns about her parenting was made to their abuse hotline. And . prior to that, in 2009 while the family lived in Alabama, Garnett . suffered severe seizures and had to be flown by helicopter to a Decatur . hospital and resuscitated. The . Journal News reported last month that the Florida agency is sharing all . information they have on the mother with New York authorities. Sad: Garnett was hospitalized more than 23 times during his five years of life and his mother Lacey updated social media to update her followers on her sons progress . Munchausen by proxy, is a psychiatric disorder which makes a parent purposely hurt their child \u2014 to get attention. 'It is so counter-intuitive to all our ideas of what parenthood is supposed to be,' said Dr. Marc D. Feldman who has written extensively about Munchausen syndrome. 'But medical child abuse can and does occur.' Feldmen told USA Today that this case has at least four distinct red-flags for him that indicate the possibility of abuse. 1) Garnett had a feeding tube for unknown reasons. In case of Munchausen by proxy the child will be sick enough to have a feeding tube or intravenous line that gives the abuser access to the body internally. 'That's an avenue for medical chaos,' Feldman said. 'A feeding tube is a real red flag.' 2) It is almost physically impossible to ingest or force feed a lethal does of salt. However, introducing it over time through a feeding tube is possible. 3) Feldman points out that those with Munchausen often exhibit pseudologia fantastica \u2014 compulsive lying - Spears told lies about her partner and about being a mother . 4) Exaggerated stories of their child being ill is another symptom - and in this case Spears obsessively detailed her child's illnesses online . People with the condition also have borderline personality disorder, a psychiatric condition marked by problems with impulsive and reckless behavior, leading to unstable relationships. Signs of Munchausen by proxy can include: .",
"target": "Suspicion has rested almost solely on Lacey Spears since the January death of her little boy Garnett .\nA source now says authorities in Westchester County, New York have deemed the boy's death a homicide by sodium poisoning .\nSpears could now face charges ranging from murder to manslaughter or negligent homicide .\nSpears allegedly asked a neighbor to dispose of one of the many stomach feeding bags she claimed her son needed as he lay dying .",
"feat_id": "4408ae99dd26219a3803b597158fa652095840d9"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)",
"feat_id": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 229690 |
| valid | 57423 |
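A minimal sketch of loading the splits and inspecting one summarization pair; the split and field names are taken from the tables above, and whether `load_dataset` resolves the repo's files under these exact split names depends on how AutoTrain stored them:
```python
from datasets import load_dataset

# "train" and "valid" are the split names listed in the table above
# (assumption: the repo exposes them under these names).
ds = load_dataset("yadheedhya/autotrain-data-wiki-sum")

sample = ds["train"][0]
print(sample["text"][:200])  # source article
print(sample["target"])      # reference summary
```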
|
GEM-submissions/ratishsp__ncp_cc__1649422112 | ---
benchmark: gem
type: prediction
submission_name: NCP_CC
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: NCP_CC
|
distilled-from-one-sec-cv12/chunk_139 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1139129512
num_examples: 221966
download_size: 1163716370
dataset_size: 1139129512
---
# Dataset Card for "chunk_139"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrewsiah/anthropic_hh_sft | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13042104
num_examples: 20159
download_size: 7382066
dataset_size: 13042104
---
# Dataset Card for "anthropic_hh_sft"
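A minimal sketch, based only on the YAML schema above, of turning one SFT record into a prompt/response pair; joining `instruction` and `input` with a newline is an assumption, not a documented convention of this repo:
```python
from datasets import load_dataset

ds = load_dataset("andrewsiah/anthropic_hh_sft", split="train")

ex = ds[0]
# Assumed prompt format: instruction, plus the input on a new line when present.
prompt = ex["instruction"] + ("\n" + ex["input"] if ex["input"] else "")
print(prompt)
print("---")
print(ex["output"])
```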
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DamarJati/IND-number-plate | ---
task_categories:
- text-classification
size_categories:
- n<1K
---
Original dataset: https://www.kaggle.com/datasets/firqaaa/indonesian-vehicle-plate-numbers |
NEUDM/semeval-2016 | ---
language:
- en
---
> The datasets above belong to the ABSA (Aspect-Based Sentiment Analysis) domain. The basic task is to extract from a sentence the aspect terms, the aspect categories (term categories), the sentiment polarity of each term in context, and the opinion words targeting that term; different datasets extract different subsets of this information, as noted in the "instruction" key of each jsonl file. Here the task has been recast as a generation task, where the model must produce the extraction results in a fixed format.
#### Example: one record from the jsonl file extracted from the acos dataset:
```
{
"task_type": "generation",
"dataset": "acos",
"input": ["the computer has difficulty switching between tablet and computer ."],
"output": "[['computer', 'laptop usability', 'negative', 'difficulty']]",
"situation": "none",
"label": "",
"extra": "",
"instruction": "
Task: Extracting aspect terms and their corresponding aspect categories, sentiment polarities, and opinion words.
Input: A sentence
Output: A list of 4-tuples, where each tuple contains the extracted aspect term, its aspect category, sentiment polarity, and opinion words (if any). Supplement: \"Null\" means that there is no occurrence in the sentence.
Example:
Sentence: \"Also it's not a true SSD drive in there but eMMC, which makes a difference.\"
Output: [['SSD drive', 'hard_disc operation_performance', 'negative', 'NULL']]'
"
}
```
> The label and extra fields are not set here. The instruction uses the string template shown above and provides one example for one-shot prompting. The ABSA datasets (absa-quad, acos, arts, aste-data-v2, mams, semeval-2014, semeval-2015, semeval-2016, towe) each use the same instruction template, with minor differences in content; in some datasets, different records within the same dataset use different instruction contents.
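A minimal sketch, not part of the original card, showing how the generated `output` string can be parsed back into (aspect term, aspect category, sentiment polarity, opinion words) tuples, since it is formatted as a Python literal:
```python
import ast

# Example output string from the acos record above.
output = "[['computer', 'laptop usability', 'negative', 'difficulty']]"

# ast.literal_eval safely turns the string back into a nested list.
for term, category, polarity, opinion in ast.literal_eval(output):
    print(f"{term} | {category} | {polarity} | {opinion}")
```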
#### Original dataset
- Data [link](https://alt.qcri.org/semeval2016/task5/)
- Paper: [SemEval-2016 Task 5: Aspect Based Sentiment Analysis](https://aclanthology.org/S16-1002/)
- Notes: the data is split into two domains, Laptop and Restaurant, placed in separate folders. The two domains differ in which elements are extracted.
#### Current SOTA
*Data from [PaperWithCode](https://paperswithcode.com/sota)*
- SemEval2016-Laptop
  - No related evaluation work was found
- [SemEval2016-Restaurant](https://paperswithcode.com/sota/aspect-based-sentiment-analysis-on-semeval-2)
  - Metric: Accuracy (classification accuracy of the extracted results)
  - Model: BERT-IL Finetuned (**88.70**)
  - Paper: [Does BERT Understand Sentiment? Leveraging Comparisons Between Contextual and Non-Contextual Embeddings to Improve Aspect-Based Sentiment Models](https://paperswithcode.com/paper/does-bert-understand-sentiment-leveraging)
  - Source: [SemEval-2016](https://paperswithcode.com/sota/aspect-based-sentiment-analysis-on-semeval-2) |
open-llm-leaderboard/details_openchat__openchat-3.5-1210 | ---
pretty_name: Evaluation run of openchat/openchat-3.5-1210
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat-3.5-1210\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T08:20:01.389363](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat-3.5-1210/blob/main/results_2023-12-18T08-20-01.389363.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6488027943708179,\n\
\ \"acc_stderr\": 0.03210806718394323,\n \"acc_norm\": 0.6497098585915643,\n\
\ \"acc_norm_stderr\": 0.03276295616957028,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5214604840999885,\n\
\ \"mc2_stderr\": 0.015414031217543209\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726097\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.664708225453097,\n\
\ \"acc_stderr\": 0.004711275408138422,\n \"acc_norm\": 0.8492332204740092,\n\
\ \"acc_norm_stderr\": 0.0035709011883580674\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644234,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253262,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253262\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001503,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001503\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\
\ \"acc_stderr\": 0.01600698993480318,\n \"acc_norm\": 0.3553072625698324,\n\
\ \"acc_norm_stderr\": 0.01600698993480318\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.0127569333828237,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.0127569333828237\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.027365861131513812,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.027365861131513812\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5214604840999885,\n\
\ \"mc2_stderr\": 0.015414031217543209\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491897\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \
\ \"acc_stderr\": 0.013052097103299104\n }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat-3.5-1210
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-20-01.389363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-20-01.389363.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- '**/details_harness|winogrande|5_2023-12-18T08-20-01.389363.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T08-20-01.389363.parquet'
- config_name: results
data_files:
- split: 2023_12_18T08_20_01.389363
path:
- results_2023-12-18T08-20-01.389363.parquet
- split: latest
path:
- results_2023-12-18T08-20-01.389363.parquet
---
# Dataset Card for Evaluation run of openchat/openchat-3.5-1210
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat-3.5-1210",
"harness_winogrande_5",
    split="latest")
```
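
Similarly, the aggregated metrics can be loaded from the "results" configuration declared in the YAML header above (a minimal sketch; the variable name `agg` is just illustrative):
```python
from datasets import load_dataset

# Load the aggregated metrics for the latest run; "results" and "latest"
# are the configuration and split names defined in this card's YAML header.
agg = load_dataset("open-llm-leaderboard/details_openchat__openchat-3.5-1210",
    "results",
    split="latest")
```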
## Latest results
These are the [latest results from run 2023-12-18T08:20:01.389363](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat-3.5-1210/blob/main/results_2023-12-18T08-20-01.389363.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6488027943708179,
"acc_stderr": 0.03210806718394323,
"acc_norm": 0.6497098585915643,
"acc_norm_stderr": 0.03276295616957028,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5214604840999885,
"mc2_stderr": 0.015414031217543209
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726097
},
"harness|hellaswag|10": {
"acc": 0.664708225453097,
"acc_stderr": 0.004711275408138422,
"acc_norm": 0.8492332204740092,
"acc_norm_stderr": 0.0035709011883580674
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644234,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253262,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001503,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3553072625698324,
"acc_stderr": 0.01600698993480318,
"acc_norm": 0.3553072625698324,
"acc_norm_stderr": 0.01600698993480318
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.0127569333828237,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.0127569333828237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.027365861131513812,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.027365861131513812
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5214604840999885,
"mc2_stderr": 0.015414031217543209
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491897
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299104
}
}
```
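
As an illustration of how these per-task scores roll up, the following sketch macro-averages `acc_norm` over the MMLU ("hendrycksTest") subtasks of a dict shaped like the one printed above (abbreviated here to three subtasks, with values copied from this run; `run` is a hypothetical variable name):
```python
# Hypothetical snippet: macro-average MMLU accuracy from a results dict
# shaped like the one printed above (abbreviated to three subtasks).
run = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5777777777777777},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6842105263157895},
}
mmlu_scores = [m["acc_norm"] for task, m in run.items()
               if task.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average acc_norm: {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```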
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1 | ---
pretty_name: Evaluation run of dfurman/GarrulusMarcoro-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dfurman/GarrulusMarcoro-7B-v0.1](https://huggingface.co/dfurman/GarrulusMarcoro-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T07:25:38.981116](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1/blob/main/results_2024-01-11T07-25-38.981116.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" configuration and in the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65235582933386,\n\
\ \"acc_stderr\": 0.032031502829239805,\n \"acc_norm\": 0.6517234590534214,\n\
\ \"acc_norm_stderr\": 0.032709809296815245,\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.01746663214957761,\n \"mc2\": 0.6705208734039925,\n\
\ \"mc2_stderr\": 0.0153587467278896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6979522184300341,\n \"acc_stderr\": 0.013417519144716417,\n\
\ \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252423\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.715893248356901,\n\
\ \"acc_stderr\": 0.004500662294697923,\n \"acc_norm\": 0.8800039832702649,\n\
\ \"acc_norm_stderr\": 0.0032429275808698575\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066306,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.01659802212058043,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.01659802212058043\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.01274307294265335,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.01274307294265335\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.01746663214957761,\n \"mc2\": 0.6705208734039925,\n\
\ \"mc2_stderr\": 0.0153587467278896\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8721389108129439,\n \"acc_stderr\": 0.009385235583937267\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \
\ \"acc_stderr\": 0.013052097103299097\n }\n}\n```"
repo_url: https://huggingface.co/dfurman/GarrulusMarcoro-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|arc:challenge|25_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|gsm8k|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hellaswag|10_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T07-25-38.981116.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- '**/details_harness|winogrande|5_2024-01-11T07-25-38.981116.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T07-25-38.981116.parquet'
- config_name: results
data_files:
- split: 2024_01_11T07_25_38.981116
path:
- results_2024-01-11T07-25-38.981116.parquet
- split: latest
path:
- results_2024-01-11T07-25-38.981116.parquet
---
# Dataset Card for Evaluation run of dfurman/GarrulusMarcoro-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dfurman/GarrulusMarcoro-7B-v0.1](https://huggingface.co/dfurman/GarrulusMarcoro-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1",
"harness_winogrande_5",
split="train")
```
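The aggregated metrics live in the "results" configuration, and every configuration exposes a "latest" split alongside the timestamped one, so they can be loaded the same way (a minimal sketch following the configuration list above):
```python
from datasets import load_dataset

# "latest" always points at the most recent run's results file.
results = load_dataset("open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1",
                       "results",
                       split="latest")
```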
## Latest results
These are the [latest results from run 2024-01-11T07:25:38.981116](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__GarrulusMarcoro-7B-v0.1/blob/main/results_2024-01-11T07-25-38.981116.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.65235582933386,
"acc_stderr": 0.032031502829239805,
"acc_norm": 0.6517234590534214,
"acc_norm_stderr": 0.032709809296815245,
"mc1": 0.5324357405140759,
"mc1_stderr": 0.01746663214957761,
"mc2": 0.6705208734039925,
"mc2_stderr": 0.0153587467278896
},
"harness|arc:challenge|25": {
"acc": 0.6979522184300341,
"acc_stderr": 0.013417519144716417,
"acc_norm": 0.7235494880546075,
"acc_norm_stderr": 0.013069662474252423
},
"harness|hellaswag|10": {
"acc": 0.715893248356901,
"acc_stderr": 0.004500662294697923,
"acc_norm": 0.8800039832702649,
"acc_norm_stderr": 0.0032429275808698575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066306,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.01659802212058043,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.01659802212058043
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265335,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265335
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5324357405140759,
"mc1_stderr": 0.01746663214957761,
"mc2": 0.6705208734039925,
"mc2_stderr": 0.0153587467278896
},
"harness|winogrande|5": {
"acc": 0.8721389108129439,
"acc_stderr": 0.009385235583937267
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299097
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mxronga/airoboros-yo | ---
license: apache-2.0
language:
- yo
- en
--- |
thercyl/JPM | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 74251017
num_examples: 2138
download_size: 42409697
dataset_size: 74251017
---
# Dataset Card for "JPM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jacobbieker/asos-1min | ---
license: mit
---
|
arthurdubrou/Bird_simple_corrections | ---
license: apache-2.0
---
|
AiresPucrs/stopwords-en | ---
dataset_info:
features:
- name: stopwords
dtype: string
splits:
- name: train
num_bytes: 1894
num_examples: 220
download_size: 2286
dataset_size: 1894
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
pretty_name: stopwords-en
size_categories:
- n<1K
license: apache-2.0
task_categories:
- text-classification
---
# stopwords-en
## Overview
The stopwords-en dataset contains a list of stopwords frequently used in the English language.
These words do not carry significant meaning on their own and are often removed from text data
during preprocessing when training shallower models on a text classification task.
## Dataset Details
```
- Dataset Name: stopwords-en
- Total Size: 220 entries
```
## Contents
The dataset consists of a single string column containing all the letters of the Roman alphabet, the numbers 1 to 10,
and frequently used English words such as "day", "days", "know", "went", "like", etc.
## How to use
```python
from datasets import load_dataset
from sklearn.feature_extraction.text import TfidfVectorizer

# Download the English stopword list.
stopwords = load_dataset('AiresPucrs/stopwords-en', split='train')['stopwords']

# Create a vectorization object via `TfidfVectorizer`.
vectorizer = TfidfVectorizer(min_df=10,
                             max_features=100000,
                             analyzer='word',
                             ngram_range=(1, 2),
                             stop_words=stopwords,  # Our list of stopwords.
                             lowercase=True)

# Fit the TfidfVectorizer to your corpus (`dataset` is assumed to be a
# Hugging Face dataset with a 'text' column).
vectorizer.fit(dataset['text'])
```
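For a quick standalone check, the list can also be applied directly (a minimal sketch reusing the `stopwords` list loaded above; the whitespace tokenization is only illustrative):
```python
sentence = "I went to the park the other day and I like it"
filtered = [w for w in sentence.lower().split() if w not in set(stopwords)]
print(filtered)  # tokens that survive the stopword filter
```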
## License
This dataset is licensed under the Apache License, version 2.0. |
Shiveswarran/llm_code_description_v5 | ---
license: apache-2.0
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_105 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 970581028.0
num_examples: 190609
download_size: 991148111
dataset_size: 970581028.0
---
# Dataset Card for "chunk_105"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kevinwang676/will_dataset | ---
license: mit
---
|
gustawdaniel/ngram-google-2012 | ---
license: cc-by-3.0
---
```
python -m spacy download en_core_web_sm
```
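Once installed, the model loads as usual (a minimal sketch, not part of the pipeline above):
```python
import spacy

# Assumes en_core_web_sm was installed with the command above.
nlp = spacy.load("en_core_web_sm")
doc = nlp("The quick brown fox jumps over the lazy dog.")
print([(token.text, token.lemma_) for token in doc])
```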
Titles:
```
jq -s '.[].title' raw/dict.jsonl
```
returns
- [x] "English"
- [ ] "English One Million"
- [x] "American English"
- [x] "British English"
- [x] "English Fiction"
- [ ] "Chinese (simplified)"
- [x] "French"
- [x] "German"
- [ ] "Hebrew"
- [ ] "Italian"
- [x] "Russian"
- [x] "Spanish"
Spellcheck:
https://pypi.org/project/pyspellchecker/
```
English - ‘en’
Spanish - ‘es’
French - ‘fr’
Portuguese - ‘pt’
German - ‘de’
Russian - ‘ru’
Arabic - ‘ar’
```
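A minimal pyspellchecker sketch with these language codes (the sample words are illustrative):
```python
from spellchecker import SpellChecker  # pip install pyspellchecker

spell = SpellChecker(language='en')  # also: es, fr, pt, de, ru, ar
words = ['langguage', 'language']
print(spell.unknown(words))           # tokens missing from the dictionary
print(spell.correction('langguage'))  # most likely correction
```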
Sets now:
- [x] "English" - en
- [x] "Spanish" - es
- [x] "French" - fr
- [x] "German" - de
- [x] "Russian" - ru
|
mask-distilled-one-sec-cv12/chunk_243 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1024444204
num_examples: 201187
download_size: 1043913059
dataset_size: 1024444204
---
# Dataset Card for "chunk_243"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct | ---
pretty_name: Evaluation run of KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct](https://huggingface.co/KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T21:16:44.386502](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct/blob/main/results_2023-09-22T21-16-44.386502.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0032508389261744967,\n\
\ \"em_stderr\": 0.000582948670855896,\n \"f1\": 0.04389052013422833,\n\
\ \"f1_stderr\": 0.0012654910642229172,\n \"acc\": 0.27577067125904975,\n\
\ \"acc_stderr\": 0.007840478478378102\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0032508389261744967,\n \"em_stderr\": 0.000582948670855896,\n\
\ \"f1\": 0.04389052013422833,\n \"f1_stderr\": 0.0012654910642229172\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501884\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5477505919494869,\n \"acc_stderr\": 0.013988256216606017\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T21_16_44.386502
path:
- '**/details_harness|drop|3_2023-09-22T21-16-44.386502.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T21-16-44.386502.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T21_16_44.386502
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-16-44.386502.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T21-16-44.386502.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:09:55.167974.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:09:55.167974.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:09:55.167974.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T21_16_44.386502
path:
- '**/details_harness|winogrande|5_2023-09-22T21-16-44.386502.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T21-16-44.386502.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_09_55.167974
path:
- results_2023-07-19T14:09:55.167974.parquet
- split: 2023_09_22T21_16_44.386502
path:
- results_2023-09-22T21-16-44.386502.parquet
- split: latest
path:
- results_2023-09-22T21-16-44.386502.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct](https://huggingface.co/KnutJaegersberg/megatron-GPT-2-345m-EvolInstruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
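These aggregated results can be loaded like any other configuration (a minimal sketch; per the configs above, the "results" configuration uses timestamped splits, with "latest" pointing to the most recent run):
```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration of this repo.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct",
    "results",
    split="latest",
)
```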
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details for one task; use a timestamped split for a
# specific run, or "latest" for the most recent one.
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct",
	"harness_winogrande_5",
	split="latest")
```
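If you are unsure which configuration corresponds to a given task, the available names can be listed programmatically (a small sketch using the `datasets` helper; expect one configuration per task, plus "results"):
```python
from datasets import get_dataset_config_names

# List every evaluation configuration available in this dataset repo.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct"
)
print(len(configs), configs[:5])
```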
## Latest results
These are the [latest results from run 2023-09-22T21:16:44.386502](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct/blob/main/results_2023-09-22T21-16-44.386502.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0032508389261744967,
"em_stderr": 0.000582948670855896,
"f1": 0.04389052013422833,
"f1_stderr": 0.0012654910642229172,
"acc": 0.27577067125904975,
"acc_stderr": 0.007840478478378102
},
"harness|drop|3": {
"em": 0.0032508389261744967,
"em_stderr": 0.000582948670855896,
"f1": 0.04389052013422833,
"f1_stderr": 0.0012654910642229172
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501884
},
"harness|winogrande|5": {
"acc": 0.5477505919494869,
"acc_stderr": 0.013988256216606017
}
}
```
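The same metrics can also be read straight from the raw JSON file linked above (a sketch using `huggingface_hub`; the filename is the one referenced in this section, and the exact nesting inside the file may differ from the excerpt shown):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for the 2023-09-22 run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KnutJaegersberg__megatron-GPT-2-345m-EvolInstruct",
    filename="results_2023-09-22T21-16-44.386502.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # inspect the top-level keys before drilling in
```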
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nexdata/Chinese_Mandarin_Synthesis_Data_Female_Customer_Service | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Chinese_Mandarin_Synthesis_Data_Female_Customer_Service
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1098?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
26.1 Hours - Chinese Mandarin Synthesis Corpus - Female, Customer Service. It is recorded by native Chinese speakers with a lively and friendly voice, and the phoneme coverage is balanced. Professional phoneticians participated in the annotation. It precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1098?source=Huggingface
### Supported Tasks and Leaderboards
tts: The dataset can be used to train a model for Text to Speech (TTS).
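As a hypothetical starting point (the data fields are not documented in this card, so the split and feature names below are assumptions to verify after loading):
```python
from datasets import load_dataset

# Hypothetical sketch: load the corpus and inspect its structure before
# wiring it into a TTS training pipeline.
ds = load_dataset("Nexdata/Chinese_Mandarin_Synthesis_Data_Female_Customer_Service")
print(ds)  # available splits and row counts
first_split = next(iter(ds))     # do not assume a "train" split exists
print(ds[first_split].features)  # audio/text field names live here
```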
### Languages
Chinese Mandarin
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
pminervini/shroom | ---
license: mit
dataset_info:
features:
- name: hyp
dtype: string
- name: ref
dtype: string
- name: src
dtype: string
- name: tgt
dtype: string
- name: model
dtype: string
- name: labels
sequence: string
- name: label
dtype: string
- name: p(Hallucination)
dtype: float64
splits:
- name: DM
num_bytes: 77485
num_examples: 187
- name: PG
num_bytes: 29874
num_examples: 125
- name: MT
num_bytes: 53749
num_examples: 187
download_size: 79829
dataset_size: 161108
configs:
- config_name: default
data_files:
- split: DM
path: data/DM-*
- split: PG
path: data/PG-*
- split: MT
path: data/MT-*
---
|
open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0 | ---
pretty_name: Evaluation run of Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0](https://huggingface.co/Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T17:31:24.848459](https://huggingface.co/datasets/open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0/blob/main/results_2024-03-22T17-31-24.848459.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.540078202110863,\n\
\ \"acc_stderr\": 0.033879824791246044,\n \"acc_norm\": 0.5449910711013712,\n\
\ \"acc_norm_stderr\": 0.03457989471408299,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135032,\n \"mc2\": 0.49588787997553246,\n\
\ \"mc2_stderr\": 0.015429495608726489\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.014604496129394908,\n\
\ \"acc_norm\": 0.5460750853242321,\n \"acc_norm_stderr\": 0.01454922110517187\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.54690300736905,\n \
\ \"acc_stderr\": 0.0049677789400119346,\n \"acc_norm\": 0.7434773949412468,\n\
\ \"acc_norm_stderr\": 0.0043582106894422675\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779205,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033581,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033581\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.027218889773308757,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.027218889773308757\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.0350143870629678,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.0350143870629678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909868,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909868\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.726605504587156,\n\
\ \"acc_stderr\": 0.0191092998460983,\n \"acc_norm\": 0.726605504587156,\n\
\ \"acc_norm_stderr\": 0.0191092998460983\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n\
\ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\
acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.0445325483632647,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.0445325483632647\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700915,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700915\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5520231213872833,\n \"acc_stderr\": 0.02677299065336182,\n\
\ \"acc_norm\": 0.5520231213872833,\n \"acc_norm_stderr\": 0.02677299065336182\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.014776765066438885,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.014776765066438885\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.028274359854894238,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.028274359854894238\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.02716368603827115,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.02716368603827115\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596154,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3624511082138201,\n\
\ \"acc_stderr\": 0.012277512533252481,\n \"acc_norm\": 0.3624511082138201,\n\
\ \"acc_norm_stderr\": 0.012277512533252481\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4624183006535948,\n \"acc_stderr\": 0.02017061497496977,\n \
\ \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.02017061497496977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087558,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135032,\n \"mc2\": 0.49588787997553246,\n\
\ \"mc2_stderr\": 0.015429495608726489\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6961325966850829,\n \"acc_stderr\": 0.012926209475483574\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3214556482183472,\n \
\ \"acc_stderr\": 0.012864471384836705\n }\n}\n```"
repo_url: https://huggingface.co/Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-31-24.848459.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-31-24.848459.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- '**/details_harness|winogrande|5_2024-03-22T17-31-24.848459.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T17-31-24.848459.parquet'
- config_name: results
data_files:
- split: 2024_03_22T17_31_24.848459
path:
- results_2024-03-22T17-31-24.848459.parquet
- split: latest
path:
- results_2024-03-22T17-31-24.848459.parquet
---
# Dataset Card for Evaluation run of Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0](https://huggingface.co/Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0",
"harness_winogrande_5",
	split="latest")
```
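The aggregated "results" configuration can be loaded the same way; a minimal sketch (the config and split names are taken from the YAML header of this card):
```python
from datasets import load_dataset

# "results" config and "latest" split are both defined in the YAML header above.
results = load_dataset(
    "open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0",
    "results",
    split="latest",
)
print(results[0])
```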
## Latest results
These are the [latest results from run 2024-03-22T17:31:24.848459](https://huggingface.co/datasets/open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0/blob/main/results_2024-03-22T17-31-24.848459.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.540078202110863,
"acc_stderr": 0.033879824791246044,
"acc_norm": 0.5449910711013712,
"acc_norm_stderr": 0.03457989471408299,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.49588787997553246,
"mc2_stderr": 0.015429495608726489
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.014604496129394908,
"acc_norm": 0.5460750853242321,
"acc_norm_stderr": 0.01454922110517187
},
"harness|hellaswag|10": {
"acc": 0.54690300736905,
"acc_stderr": 0.0049677789400119346,
"acc_norm": 0.7434773949412468,
"acc_norm_stderr": 0.0043582106894422675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779205,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033581,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033581
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308757,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308757
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.0350143870629678,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.0350143870629678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909868,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909868
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.0191092998460983,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.0191092998460983
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.0445325483632647,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.0445325483632647
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700915,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700915
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5520231213872833,
"acc_stderr": 0.02677299065336182,
"acc_norm": 0.5520231213872833,
"acc_norm_stderr": 0.02677299065336182
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438885,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438885
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.028274359854894238,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.028274359854894238
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.02716368603827115,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.02716368603827115
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596154,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3624511082138201,
"acc_stderr": 0.012277512533252481,
"acc_norm": 0.3624511082138201,
"acc_norm_stderr": 0.012277512533252481
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4624183006535948,
"acc_stderr": 0.02017061497496977,
"acc_norm": 0.4624183006535948,
"acc_norm_stderr": 0.02017061497496977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.49588787997553246,
"mc2_stderr": 0.015429495608726489
},
"harness|winogrande|5": {
"acc": 0.6961325966850829,
"acc_stderr": 0.012926209475483574
},
"harness|gsm8k|5": {
"acc": 0.3214556482183472,
"acc_stderr": 0.012864471384836705
}
}
```
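If you prefer working with the raw file, the results JSON linked above can also be fetched directly; a minimal sketch (the repo id and filename come verbatim from the link above; the exact nesting of the file is an assumption, hence the defensive lookup):
```python
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0",
    filename="results_2024-03-22T17-31-24.848459.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
# The metrics shown above live under "all"; whether they sit under a
# top-level "results" key in the raw file is an assumption.
metrics = data.get("results", data)["all"]
print(metrics["acc"], metrics["mc2"])
```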
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
WinterSchool/MideficsDataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: conversation
struct:
- name: data
list:
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 2132989003.9490128
num_examples: 3800
- name: test
num_bytes: 112823892.05098726
num_examples: 201
download_size: 2244437082
dataset_size: 2245812896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- question-answering
- visual-question-answering
language:
- en
tags:
- medical
- image
- image-to-text
pretty_name: Midefics conversational dataset
size_categories:
- 1K<n<10K
---
MideficsDataset is a dataset of conversations on radiology and skin cancer images. The dataset is intended to be used for training and testing Medical Visual Question Answering (VQA) systems.
The dataset is built from [MURA](https://arxiv.org/abs/1712.06957), [ISIC](https://www.isic-archive.com/) and [ROCO](https://www.semanticscholar.org/paper/Radiology-Objects-in-COntext-(ROCO)%3A-A-Multimodal-Pelka-Koitka/a564fabf130ff6e2742cfba90c7a4018937d764d) which are free open-access online datasets of medical images.
The conversations were generated using GPT-3.5-turbo based on metadata associated with each image.
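A minimal loading sketch, assuming the schema in the YAML header (each example carries an `image` plus a `conversation.data` list of question/answer pairs):
```python
from datasets import load_dataset

ds = load_dataset("WinterSchool/MideficsDataset", split="train")
example = ds[0]
print(example["id"])
# Walk the question/answer turns of one conversation.
for turn in example["conversation"]["data"]:
    print("Q:", turn["question"])
    print("A:", turn["answer"])
```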
|
tanmaylaud/scidcc-instructions | ---
license: openrail
task_categories:
- conversational
- summarization
- question-answering
language:
- en
tags:
- climate
- instruction
pretty_name: scidcc-instr
size_categories:
- 1K<n<10K
source_datasets:
- scidcc
---
### Dataset Summary
Instruction-Response pairs generated using the SciDCC Climate Dataset from [Climabench](https://huggingface.co/datasets/iceberg-nlp/climabench)
### Format
```
### Instruction:
Present a fitting title for the provided text.
For those who study earthquakes, one major challenge has been trying to understand all the physics of a fault -- both during an earthquake and at times of "rest" -- in order to know more about how a particular region may behave in the future. Now, researchers at the California Institute of Technology (Caltech) have developed the first computer model of an earthquake-producing fault segment that reproduces, in a single physical framework, the available observations of both the fault's seismic (fast) and aseismic (slow) behavior.
### Response:
Greater insight into earthquake cycles
```
#### 4 distinct instruction-response pairs per SciDCC article (refer to columns instruction1, instruction2, instruction3, instruction4); a loading sketch follows below.
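A quick inspection sketch (the `train` split name is an assumption; the instruction columns are named above):
```python
from datasets import load_dataset

# Split name "train" is an assumption; check the repo if loading fails.
ds = load_dataset("tanmaylaud/scidcc-instructions", split="train")
print(ds.column_names)
print(ds[0]["instruction1"])
```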
|
EgilKarlsen/CSIC_DistilRoBERTa_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115621182
num_examples: 37500
- name: test
num_bytes: 38540387
num_examples: 12500
download_size: 211876775
dataset_size: 154161569
---
# Dataset Card for "CSIC_DistilRoBERTa_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FSDL-Fashion/dummy_swin_pipe_5k | ---
dataset_info:
features:
- name: path
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 20800000
num_examples: 5000
download_size: 21312459
dataset_size: 20800000
---
# Dataset Card for "dummy_swin_pipe_5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Food101_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 392695
num_examples: 1000
download_size: 49126
dataset_size: 392695
---
# Dataset Card for "Food101_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/paimon_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Paimon/パイモン/페이몬/派蒙 (Genshin Impact)
This is the dataset of Paimon/パイモン/페이몬/派蒙 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `white_hair, halo, hair_ornament, hair_between_eyes, blue_eyes, short_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 739.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/paimon_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 500      | 640.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/paimon_genshin/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1203 | 1.27 GiB | [Download](https://huggingface.co/datasets/CyberHarem/paimon_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/paimon_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, cape, long_sleeves, looking_at_viewer, scarf, solo, white_dress, smile, blush, closed_mouth, open_mouth |
| 1 | 5 |  |  |  |  |  | 1girl, cape, long_sleeves, looking_at_viewer, open_mouth, scarf, single_thighhigh, solo, upper_teeth_only, white_dress, :d, blush, white_background, white_thighhighs, simple_background |
| 2 | 5 |  |  |  |  |  | 1girl, cape, full_body, long_sleeves, looking_at_viewer, open_mouth, scarf, single_thighhigh, solo, white_dress, white_footwear, white_thighhighs, :d, thighhighs_under_boots, upper_teeth_only, white_background, blush, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | long_sleeves | looking_at_viewer | scarf | solo | white_dress | smile | blush | closed_mouth | open_mouth | single_thighhigh | upper_teeth_only | :d | white_background | white_thighhighs | simple_background | full_body | white_footwear | thighhighs_under_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------|:-------|:--------------|:--------|:--------|:---------------|:-------------|:-------------------|:-------------------|:-----|:-------------------|:-------------------|:--------------------|:------------|:-----------------|:-------------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X |
|
anan-2024/twitter_dataset_1713053378 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20284
num_examples: 45
download_size: 11202
dataset_size: 20284
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davanstrien/fuego-20230308-091454-1f2ec4 | ---
tags:
- fuego
fuego:
id: 20230308-091454-1f2ec4
status: preparing
script: script
requirements_file: requirements.txt
space_id: davanstrien/fuego-20230308-091454-1f2ec4
space_hardware: cpu-basic
---
|
ADSASDASDASDASDASD/wesley | ---
license: openrail
---
|