datasetId | card |
|---|---|
kaleemWaheed/twitter_dataset_1713009571 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 42416
num_examples: 99
download_size: 16911
dataset_size: 42416
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
imageomics/Curated_GoldStandard_Hoyal_Cuthill | ---
license: cc0-1.0
task_categories:
- image-classification
tags:
- butterfly
- heliconius erato
- heliconius melpomene
- full body
- dorsal
- RGB
- bird view
- butterfly view
- bird acuity
- butterfly acuity
- imbalanced
- mimicry
- cv
pretty_name: Curated Gold Standard Hoyal Cuthill Dataset
size_categories:
- n<1K
---
# Dataset Card for Curated Gold Standard Hoyal Cuthill Dataset
## Dataset Description
Dorsal full-body images of subspecies of _Heliconius erato_ and _Heliconius melpomene_ (18 subspecies total).
There are 960 images of 320 specimens (3 images per specimen: original, bird-transformed, and butterfly-transformed).
The original images are low-resolution RGB photographs (photographs were "cropped and resized to a height of 64 pixels (maintaining the original image aspect ratio and padded to 140 pixels wide)"(Hoyal Cuthill et al., 2019)).
These low-resolution images were then transformed using [AcuityView](https://cran.r-project.org/web/packages/AcuityView/index.html) with estimates of acuity from [AcuityView 2.0](http://www.empiricalimaging.com/knowledge-base/acuityview/) and [(Land, 1997)](https://www.annualreviews.org/doi/10.1146/annurev.ento.42.1.147).
Users should note that more recent estimates of insect visual acuity have been published since (Land, 1997).
This data represents a subset of images processed from the Hoyal Cuthill et al. dataset available at [doi:10.5061/dryad.2hp1978](https://doi.org/10.5061/dryad.2hp1978). Their original dataset also includes ventral images.
**Note:** `dorsal_images_cuthill` contains processed dorsal images from the original Hoyal Cuthill dataset (all 1,234 specimens).
- **Homepage:**
- **Repository:** [Butterfly-mimicry](https://github.com/Imageomics/Butterfly-mimicry) contains research done using this dataset.
- **Paper:** [Imageomics Approach to Understanding Visual Biological Trait Similarities using Butterfly Mimicry as a Model System](http://rave.ohiolink.edu/etdc/view?acc_num=osu168198420667979)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary

<!---
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
--->
### Supported Tasks and Leaderboards
_Heliconius erato_ and _Heliconius melpomene_ subspecies identification (image classification), with variable settings for acuity of the observer (bird, butterfly, or human/other).
### Languages
English
## Dataset Structure
```
|-- dorsal_images_cuthill
| |
| |-- 10427965_D_lowres.tif
| |
| |-- 10427966_D_lowres.tif
| |
| | ...
|
|-- Acuities
| |
| |-- train_bird
| | |
| | |-- erato_cyrbia
| | |
| | |-- erato_etylus
| | |
| | | ...
| |
| |-- test_bird
| | |
| | |-- erato_cyrbia
| | |
| | |-- erato_etylus
| | |
| | | ...
| |
| |-- train_butterfly
| | |
| | |-- erato_cyrbia
| | |
| | |-- erato_etylus
| | |
| | | ...
| |
| |-- test_butterfly
| | |
| | |-- erato_cyrbia
| | |
| | |-- erato_etylus
| | |
| | | ...
|
|-- train
| |
| |-- erato_cyrbia
| |
| |-- erato_etylus
| |
| | ...
|
|-- test
| |
| |-- erato_cyrbia
| |
| |-- erato_etylus
| |
| | ...
```
### Data Instances
* Type: PNG
* Size: 128px x 128px
* Background: [210, 210, 210] (gray)
* Fit in frame: Most padding is above and below the image, some on the left and right.
* Ruler or Scale: None
* Color Reference (ColorChecker, white-balance, None): None
### Data Fields
**In `Hoyal_Cuthill_GoldStandard_metadata_cleaned.csv`:**
* `NHM_Specimen`: Natural History Museum Specimen number
* `Image_filename`: filename of image of specimen
* `View`: whether ventral or dorsal view of specimen (all dorsal)
* `Species`: species of specimen (melpomene or erato)
* `Subspecies`: subspecies of the specimen
* `Sex`: sex of the specimen (male or female)
* `addit_taxa_info`: additional taxonomic information (subspecies)
* `type_stat`: indicates "classical" or "example" specimen of species or subspecies ('ST', 'PT', or 'HT', indicating syntypes, paratypes, or holotypes, respectively). This field is mostly null.
* `hybrid_stat`: hybrid status ('valid subspecies', 'subspecies synonym' or 'unknown' (only 1))
* `in_reduced`: whether or not the specimen was used in the second analysis by Hoyal Cuthill et al. (1 or 0 to indicate yes or no, respectively). This was an effort to remove potential hybrids from their analysis; it does not always match our indication of hybrid status.
* `locality`: where specimen was collected
* `lat`: latitude where specimen was collected
* `lon`: longitude where specimen was collected
* `speciesdesig`: species designation, first initial of species '.' subspecies (e.g., 'm. rosina')
`Train_Test_Curated_GoldStandard_Hoyal_Cuthill.csv` has three additional columns:
* `Image_filename_png`: filename of (png) image of specimen; `dorsal_images_cuthill/<Image_filename_png>` is the filepath for the processed dorsal image
* `subset`: whether this is part of the training or test set (`train` or `test`)
* `filepath`: the filepath for the train or test image
`Acuity_Curated_GoldStandard_Hoyal_Cuthill.csv` also has `Image_filename_png` and `subset`, but `subset` references the bird and butterfly acuity training and test sets. Additionally, it has columns:
* `bird_filepath`: the filepath for the bird acuity version of the image
* `butterfly_filepath`: the filepath for the butterfly acuity version of the image
### Data Splits
There are 250 images in each training set and 70 in each test set.
RGB training images are in the `train` folder and test images are in the `test` folder.
For bird and butterfly acuities, their respective training and test images are in the `train_bird` (`train_butterfly`) and `test_bird` (`test_butterfly`) folders.
All of these folders are further subdivided by subspecies. Filepaths to access these images are provided in `Train_Test_Curated_GoldStandard_Hoyal_Cuthill.csv` and `Acuity_Curated_GoldStandard_Hoyal_Cuthill.csv`, respectively; a minimal loading sketch follows.
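The snippet below is a hedged sketch of reading images through the metadata CSVs. It assumes a local copy of this repository and that `pandas` and `Pillow` are installed; `repo_root` is an illustrative variable, not part of the dataset.
```python
from pathlib import Path

import pandas as pd
from PIL import Image

repo_root = Path(".")  # local copy of this dataset repository (assumption)

# Train/test images via the `filepath` column.
meta = pd.read_csv(repo_root / "Train_Test_Curated_GoldStandard_Hoyal_Cuthill.csv")
train = meta[meta["subset"] == "train"]
train_images = [Image.open(repo_root / fp) for fp in train["filepath"]]
train_labels = train["Subspecies"].tolist()

# Acuity-transformed versions via `bird_filepath` / `butterfly_filepath`.
acuity = pd.read_csv(repo_root / "Acuity_Curated_GoldStandard_Hoyal_Cuthill.csv")
bird_images = [Image.open(repo_root / fp) for fp in acuity["bird_filepath"]]
```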
## Dataset Creation
Processing steps included:
1. Hybrid separation
2. Label correction
3. Removal of subspecies with no mimic pairs
4. Making the background uniform across all images
5. Making each image square via padding (steps 4 and 5 are sketched below)
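The following is a minimal sketch of steps 4 and 5, assuming Pillow; the gray background value and 128 px target size come from the Data Instances section above, while the function name and scaling logic are illustrative, not the curators' actual pipeline.
```python
from PIL import Image

BACKGROUND = (210, 210, 210)  # gray background used across the dataset

def pad_to_square(image: Image.Image, size: int = 128) -> Image.Image:
    """Scale the longer side to `size`, then center on a gray square canvas."""
    image = image.convert("RGB")
    scale = size / max(image.size)
    resized = image.resize(
        (round(image.width * scale), round(image.height * scale))
    )
    canvas = Image.new("RGB", (size, size), BACKGROUND)
    # Butterflies are wider than tall, so most padding lands above and below.
    offset = ((size - resized.width) // 2, (size - resized.height) // 2)
    canvas.paste(resized, offset)
    return canvas

# e.g., pad_to_square(Image.open("10427965_D_lowres.tif")).save("10427965_D.png")
```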
### Curation Rationale
This dataset was curated for training a model to classify different species of _Heliconius_ butterflies while taking into account mimicry between species and the acuity of the observer (bird, butterfly, or human/other).
The original data (Hoyal Cuthill et al. 2019) contained misclassified species/subspecies, and some locality/collection sites were outside the known range of the butterflies.
It also contained hybrid and aberrant samples that had the potential to muddle classification results.
To prevent this, the data were further refined by several _Heliconius_ experts to remove hybrid and aberrant samples.
Lastly, bird and butterfly acuities were added to provide another level of analysis using [AcuityView](https://cran.r-project.org/web/packages/AcuityView/index.html) and estimates of observer acuity.
### Source Data
Hoyal Cuthill et al. [doi:10.5061/dryad.2hp1978](https://doi.org/10.5061/dryad.2hp1978).
#### Initial Data Collection and Normalization
Photographers: Robyn Crowther and Sophie Ledger, Natural History Museum, London.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
The original data had some misclassified species/subspecies and multiple hybrid samples. These samples were removed by hand by Owen McMillan, Christopher Lawrence, Jim Mallet, and Krzysztof Kozak.
Some localities were outside the known range of the butterflies and were removed using QGIS and known subspecies ranges.
#### Who are the annotators?
Christopher Lawrence,
Jim Mallet,
Owen McMillan, and
Krzysztof Kozak.
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
N/A
### Discussion of Biases
Biased towards species and subspecies within Heliconius. Focused on _Heliconius erato_ and _Heliconius melpomene_.
### Other Known Limitations
* No genetic data available.
* Non-uniform distribution of subspecies (imbalanced).
## Additional Information
### Dataset Curators
* Krzysztof Kozak (University of California Berkeley) - ORCID: 0000-0001-8980-3173
* Christopher Lawrence (Princeton University) - ORCID: 0000-0002-3846-5968
* James Mallet (Harvard University) - ORCID: 0000-0002-3370-0367
* Owen McMillan (Smithsonian Tropical Research Institute) - ORCID: 0000-0003-2805-2745
* David Carlyn (The Ohio State University) - ORCID: 0000-0002-8323-0359
* Mohannad Elhamod (Virginia Tech) - ORCID: 0000-0002-2383-947X
### Licensing Information
This work has been marked as dedicated to the public domain by applying the [CC0 Public Domain Waiver](https://creativecommons.org/publicdomain/zero/1.0/).
### Citation Information
Krzysztof Kozak, Christopher Lawrence, James Mallet, Owen McMillan, David Carlyn, Mohannad Elhamod. (2023), "Curated GoldStandard Hoyal Cuthill", https://huggingface.co/datasets/imageomics/Curated_GoldStandard_Hoyal_Cuthill.
Ramesh Babu, R. (2023). _Imageomics Approach to Understanding Visual Biological Trait Similarities using Butterfly Mimicry as a Model System_ [Master's thesis, Ohio State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=osu168198420667979
Please also cite the original dataset from which this was adapted and its accompanying paper:
* Hoyal Cuthill, Jennifer F. et al. (2019), Data from: Deep learning on butterfly phenotypes tests evolution’s oldest mathematical model, Dryad, Dataset, https://doi.org/10.5061/dryad.2hp1978.
* Hoyal Cuthill, Jennifer F. et al. (2019), Deep learning on butterfly phenotypes tests evolution’s oldest mathematical model, Science Advances, Article-journal, https://doi.org/10.1126/sciadv.aaw4967.
#### BibTeX
Dataset:
```
@misc{CGSHC23,
author = {Krzysztof Kozak and Christopher Lawrence and James Mallet and Owen McMillan and David Carlyn and Mohannad Elhamod},
title = {Curated GoldStandard Hoyal Cuthill},
year = {2023},
url = {https://huggingface.co/datasets/imageomics/Curated_GoldStandard_Hoyal_Cuthill},
doi = {10.57967/hf/1351},
publisher = {Hugging Face}
}
```
Imageomics paper (Part of thesis work):
```
@mastersthesis{ramesh_babu23,
title = {Imageomics Approach to Understanding Visual Biological Trait Similarities using Butterfly Mimicry as a Model System},
author = {Reshma Ramesh Babu},
year = 2023,
month = {May},
note = {Available at \url{http://rave.ohiolink.edu/etdc/view?acc_num=osu168198420667979}},
school = {The Ohio State University},
type = {Master's thesis}
}
```
### Contributions
The [Imageomics Institute](https://imageomics.org) is funded by the US National Science Foundation's Harnessing the Data Revolution (HDR) program under [Award #2118240](https://www.nsf.gov/awardsearch/showAward?AWD_ID=2118240) (Imageomics: A New Frontier of Biological Information Powered by Knowledge-Guided Machine Learning). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
|
AMead10/wikipedia_20240320_en | ---
language:
- en
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 22057264844
num_examples: 6797834
download_size: 12695118248
dataset_size: 22057264844
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Processed English Wikipedia dump from 2024-03-20.
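As a quick usage sketch (assuming the `datasets` library is installed), the dump can be loaded directly; the column names come from the YAML above.
```python
from datasets import load_dataset

# Single "train" split with id / url / title / text columns.
wiki = load_dataset("AMead10/wikipedia_20240320_en", split="train")
print(wiki[0]["title"])
```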
Made using [this repo](https://huggingface.co/datasets/wikipedia). |
open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v19.1-4k | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-mistral-7b-v19.1-4k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-mistral-7b-v19.1-4k](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v19.1-4k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v19.1-4k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-05T04:47:39.804021](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v19.1-4k/blob/main/results_2024-03-05T04-47-39.804021.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5697608239015811,\n\
\ \"acc_stderr\": 0.033746024067231935,\n \"acc_norm\": 0.575157090348363,\n\
\ \"acc_norm_stderr\": 0.03444032953538369,\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.016451264440068242,\n \"mc2\": 0.4825142403824786,\n\
\ \"mc2_stderr\": 0.0151353561803002\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5034129692832765,\n \"acc_stderr\": 0.014611050403244081,\n\
\ \"acc_norm\": 0.5341296928327645,\n \"acc_norm_stderr\": 0.014577311315231102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5632344154550887,\n\
\ \"acc_stderr\": 0.004949716368890489,\n \"acc_norm\": 0.7457677753435571,\n\
\ \"acc_norm_stderr\": 0.004345388614520028\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517414,\n \"\
acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517414\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"\
acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178263,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178263\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868595,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868595\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552379,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395953,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n\
\ \"acc_stderr\": 0.015218109544410172,\n \"acc_norm\": 0.2927374301675978,\n\
\ \"acc_norm_stderr\": 0.015218109544410172\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722415,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722415\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n\
\ \"acc_stderr\": 0.01257087103214607,\n \"acc_norm\": 0.41199478487614083,\n\
\ \"acc_norm_stderr\": 0.01257087103214607\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.02823136509275841,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.02823136509275841\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.016451264440068242,\n \"mc2\": 0.4825142403824786,\n\
\ \"mc2_stderr\": 0.0151353561803002\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6992896606156275,\n \"acc_stderr\": 0.012888010494704736\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33510235026535257,\n \
\ \"acc_stderr\": 0.013001948176422952\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v19.1-4k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|arc:challenge|25_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|gsm8k|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hellaswag|10_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T04-47-39.804021.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T04-47-39.804021.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- '**/details_harness|winogrande|5_2024-03-05T04-47-39.804021.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-05T04-47-39.804021.parquet'
- config_name: results
data_files:
- split: 2024_03_05T04_47_39.804021
path:
- results_2024-03-05T04-47-39.804021.parquet
- split: latest
path:
- results_2024-03-05T04-47-39.804021.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v19.1-4k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v19.1-4k](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v19.1-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v19.1-4k",
"harness_winogrande_5",
split="train")
```
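The metadata above also defines a `results` configuration whose `latest` split always points at the most recent run. A minimal sketch of loading the aggregated metrics (the variable name is illustrative):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split tracks the
# most recent run, while timestamp-named splits select a specific run.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v19.1-4k",
    "results",
    split="latest",
)
```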
## Latest results
These are the [latest results from run 2024-03-05T04:47:39.804021](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v19.1-4k/blob/main/results_2024-03-05T04-47-39.804021.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5697608239015811,
"acc_stderr": 0.033746024067231935,
"acc_norm": 0.575157090348363,
"acc_norm_stderr": 0.03444032953538369,
"mc1": 0.3292533659730722,
"mc1_stderr": 0.016451264440068242,
"mc2": 0.4825142403824786,
"mc2_stderr": 0.0151353561803002
},
"harness|arc:challenge|25": {
"acc": 0.5034129692832765,
"acc_stderr": 0.014611050403244081,
"acc_norm": 0.5341296928327645,
"acc_norm_stderr": 0.014577311315231102
},
"harness|hellaswag|10": {
"acc": 0.5632344154550887,
"acc_stderr": 0.004949716368890489,
"acc_norm": 0.7457677753435571,
"acc_norm_stderr": 0.004345388614520028
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868595,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868595
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552379,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395953,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2927374301675978,
"acc_stderr": 0.015218109544410172,
"acc_norm": 0.2927374301675978,
"acc_norm_stderr": 0.015218109544410172
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722415,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722415
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.01257087103214607,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.01257087103214607
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.02823136509275841,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.02823136509275841
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3292533659730722,
"mc1_stderr": 0.016451264440068242,
"mc2": 0.4825142403824786,
"mc2_stderr": 0.0151353561803002
},
"harness|winogrande|5": {
"acc": 0.6992896606156275,
"acc_stderr": 0.012888010494704736
},
"harness|gsm8k|5": {
"acc": 0.33510235026535257,
"acc_stderr": 0.013001948176422952
}
}
```
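To read individual metrics out of that file programmatically, one option is to download the raw JSON linked above and index it with the task keys shown in the excerpt. A sketch, with the caveat that the file's exact top-level layout (whether the excerpt sits at the root or under a wrapper key) is an assumption:
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file linked above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v19.1-4k",
    filename="results_2024-03-05T04-47-39.804021.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Assumption: the excerpt above is reachable at the top level; adjust the
# lookup (e.g. results["results"][...]) if the file nests it under a key.
print(results["all"]["acc"])  # overall accuracy, 0.5697... in this run
```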
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TenzinGayche/TTS_nocs_pd_b1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
- name: path
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: speaker_embeddings
sequence: float32
splits:
- name: train
num_bytes: 25886464618.0
num_examples: 92167
download_size: 1609511662
dataset_size: 25886464618.0
---
# Dataset Card for "TTS_nocs_pd_b1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cahya/instructions-sw | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 917585.3073463269
num_examples: 1800
- name: test
num_bytes: 51486.73113443278
num_examples: 101
- name: validation
num_bytes: 50976.96151924038
num_examples: 100
download_size: 581487
dataset_size: 1020049.0000000001
---
# Dataset Card for "instructions-sw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ABC-iRobotics/SMVB | ---
language:
- en
license: gpl-3.0
tags:
- vision
- image-segmentation
- instance-segmentation
- object-detection
- optical-flow
- depth
- synthetic
- sim-to-real
annotations_creators:
- machine-generated
pretty_name: SMVB Dataset
size_categories:
- 1K<n<10K
task_categories:
- object-detection
- image-segmentation
- depth-estimation
- video-classification
- other
task_ids:
- instance-segmentation
- semantic-segmentation
---
# Synthetic Multimodal Video Benchmark (SMVB)
A dataset of synthetic images from distinct synthetic scenes, annotated with object/instance/semantic segmentation masks, depth data, surface normal information, and optical flow, for testing and benchmarking model performance in multi-task/multi-objective learning.
### Supported Tasks and Leaderboards
The dataset supports tasks such as semantic segmentation, instance segmentation, object detection, image classification, depth estimation, surface normal estimation, optical flow estimation, and video object segmentation.
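A minimal loading sketch, assuming the dataset is consumable via `datasets` with a default configuration and a `train` split (the card does not yet document configs or splits):
```python
from datasets import load_dataset

# Assumed default config and "train" split; adjust once the card
# documents the actual structure.
smvb = load_dataset("ABC-iRobotics/SMVB", split="train")
print(smvb.features)  # e.g. image plus the annotation modalities listed above
```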
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
### Citation Information
```bibtex
@INPROCEEDINGS{karoly2024synthetic,
author={Károly, Artúr I. and Nádas, Imre and Galambos, Péter},
booktitle={2024 IEEE 22nd World Symposium on Applied Machine Intelligence and Informatics (SAMI)},
title={Synthetic Multimodal Video Benchmark (SMVB): Utilizing Blender for rich dataset generation},
year={2024},
volume={},
number={},
pages={},
doi={}}
``` |
zzigakovacic/WeatherCorn | ---
task_categories:
- text-to-image
language:
- en
size_categories:
- n<1K
--- |
DheerajNalapat/code_correction | ---
license: mit
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 6024669
num_examples: 7770
download_size: 1444414
dataset_size: 6024669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PJMixers/Anthropic_persuasion-ShareGPT | ---
language:
- en
size_categories:
- 1K<n<10K
---
Added the prompts that were listed in the [blog post](https://www.anthropic.com/news/measuring-model-persuasiveness). Skipped the control samples. |
HamdanXI/lj-inprogress | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 22050
- name: text
dtype: string
splits:
- name: train
num_bytes: 3856520559.0
num_examples: 13100
download_size: 3784764912
dataset_size: 3856520559.0
---
# Dataset Card for "lj-inprogress"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arize-ai/xtreme_en | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
pretty_name: named-entity-recognition-en-no-drift
size_categories:
- 10K<n<100K
source_datasets:
- extended|xtreme
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# Dataset Card for `reviews_with_drift`
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
This dataset was crafted to be used in our tutorial [Link to the tutorial when ready]. It consists of a large Movie Review Dataset mixed with some reviews from a Hotel Review Dataset. The training/validation sets are obtained purely from the Movie Review Dataset, while the production set is mixed. Some other features have been added (`age`, `gender`, `context`), as well as a made-up timestamp `prediction_ts` of when the inference took place.
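A minimal sketch of loading the data for inspection, assuming a default configuration and a `train` split (the card does not document the split names):
```python
from datasets import load_dataset

# Assumed default config and "train" split.
ds = load_dataset("arize-ai/xtreme_en", split="train")
print(ds.features)  # expect text plus added fields such as age, gender,
                    # context, and the prediction_ts timestamp
print(ds[0])
```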
### Supported Tasks and Leaderboards
`text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the text, predict the sentiment (positive or negative).
### Languages
The text is mainly written in English.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@fjcasti1](https://github.com/fjcasti1) for adding this dataset. |
ineoApp/new_factures_DS | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': reference
'2': numero facture
'3': fournisseur
'4': date facture
'5': date limite
'6': montant ht
'7': tva
'8': montant ttc
'9': unitP
'10': prix tva
'11': addresse
'12': art1 prix unit
'13': art1 designation
'14': art1 quantite
'15': art1 tva
'16': art1 montant ht
'17': art2 designation
'18': art2 quantite
'19': art2 prix unit
'20': art2 tva
'21': art2 montant ht
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 3977231.3333333335
num_examples: 4
- name: test
num_bytes: 1988615.6666666667
num_examples: 2
download_size: 5955040
dataset_size: 5965847.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_yanolja__EEVE-Korean-2.8B-v1.0 | ---
pretty_name: Evaluation run of yanolja/EEVE-Korean-2.8B-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yanolja/EEVE-Korean-2.8B-v1.0](https://huggingface.co/yanolja/EEVE-Korean-2.8B-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yanolja__EEVE-Korean-2.8B-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-24T14:38:05.413550](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__EEVE-Korean-2.8B-v1.0/blob/main/results_2024-02-24T14-38-05.413550.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.518047591724302,\n\
\ \"acc_stderr\": 0.03424766472225095,\n \"acc_norm\": 0.520665328134594,\n\
\ \"acc_norm_stderr\": 0.034963937392097244,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.01607750926613303,\n \"mc2\": 0.4427029480935236,\n\
\ \"mc2_stderr\": 0.014593310268993558\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.01456431885692485,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650645\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5344552877912766,\n\
\ \"acc_stderr\": 0.004977919906875367,\n \"acc_norm\": 0.7214698267277435,\n\
\ \"acc_norm_stderr\": 0.004473595650807671\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.03812400565974833,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.03812400565974833\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.042407993275749255,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.042407993275749255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.033832012232444426,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.033832012232444426\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.02533466708095492,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.02533466708095492\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.032449808499900284,\n\
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.032449808499900284\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415185,\n \"\
acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415185\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28921568627450983,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.030874537537553617,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.030874537537553617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.024904439098918242,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.024904439098918242\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.644955300127714,\n\
\ \"acc_stderr\": 0.01711208577277299,\n \"acc_norm\": 0.644955300127714,\n\
\ \"acc_norm_stderr\": 0.01711208577277299\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.02636243757454654,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.02636243757454654\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.565359477124183,\n \"acc_stderr\": 0.02838425670488304,\n\
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02838425670488304\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.5241157556270096,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413317,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413317\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3983050847457627,\n\
\ \"acc_stderr\": 0.012503310565166258,\n \"acc_norm\": 0.3983050847457627,\n\
\ \"acc_norm_stderr\": 0.012503310565166258\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.511437908496732,\n \"acc_stderr\": 0.020222541515610874,\n \
\ \"acc_norm\": 0.511437908496732,\n \"acc_norm_stderr\": 0.020222541515610874\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.01607750926613303,\n \"mc2\": 0.4427029480935236,\n\
\ \"mc2_stderr\": 0.014593310268993558\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262008\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3639120545868082,\n \
\ \"acc_stderr\": 0.013252539227966199\n }\n}\n```"
repo_url: https://huggingface.co/yanolja/EEVE-Korean-2.8B-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|arc:challenge|25_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|gsm8k|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hellaswag|10_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-38-05.413550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T14-38-05.413550.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- '**/details_harness|winogrande|5_2024-02-24T14-38-05.413550.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-24T14-38-05.413550.parquet'
- config_name: results
data_files:
- split: 2024_02_24T14_38_05.413550
path:
- results_2024-02-24T14-38-05.413550.parquet
- split: latest
path:
- results_2024-02-24T14-38-05.413550.parquet
---
# Dataset Card for Evaluation run of yanolja/EEVE-Korean-2.8B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yanolja/EEVE-Korean-2.8B-v1.0](https://huggingface.co/yanolja/EEVE-Korean-2.8B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yanolja__EEVE-Korean-2.8B-v1.0",
"harness_winogrande_5",
split="train")
```
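The aggregated "results" configuration can be loaded the same way; a minimal sketch (not part of the original card) using the "results" config and "latest" split declared in this card's YAML header:
```python
from datasets import load_dataset

# Load the aggregated scores of the most recent run; "results" and "latest"
# are the configuration and split names declared in the YAML header above.
results = load_dataset("open-llm-leaderboard/details_yanolja__EEVE-Korean-2.8B-v1.0",
    "results",
    split="latest")
```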
## Latest results
These are the [latest results from run 2024-02-24T14:38:05.413550](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__EEVE-Korean-2.8B-v1.0/blob/main/results_2024-02-24T14-38-05.413550.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.518047591724302,
"acc_stderr": 0.03424766472225095,
"acc_norm": 0.520665328134594,
"acc_norm_stderr": 0.034963937392097244,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.01607750926613303,
"mc2": 0.4427029480935236,
"mc2_stderr": 0.014593310268993558
},
"harness|arc:challenge|25": {
"acc": 0.5401023890784983,
"acc_stderr": 0.01456431885692485,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650645
},
"harness|hellaswag|10": {
"acc": 0.5344552877912766,
"acc_stderr": 0.004977919906875367,
"acc_norm": 0.7214698267277435,
"acc_norm_stderr": 0.004473595650807671
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.03812400565974833,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.03812400565974833
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.042407993275749255,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.042407993275749255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.033832012232444426,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.033832012232444426
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155141,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.02533466708095492,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.02533466708095492
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.032449808499900284,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.032449808499900284
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415185,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415185
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918242,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918242
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.644955300127714,
"acc_stderr": 0.01711208577277299,
"acc_norm": 0.644955300127714,
"acc_norm_stderr": 0.01711208577277299
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.02636243757454654,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.02636243757454654
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02838425670488304,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02838425670488304
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413317,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413317
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3983050847457627,
"acc_stderr": 0.012503310565166258,
"acc_norm": 0.3983050847457627,
"acc_norm_stderr": 0.012503310565166258
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.375,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.375,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.511437908496732,
"acc_stderr": 0.020222541515610874,
"acc_norm": 0.511437908496732,
"acc_norm_stderr": 0.020222541515610874
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.01607750926613303,
"mc2": 0.4427029480935236,
"mc2_stderr": 0.014593310268993558
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.012370922527262008
},
"harness|gsm8k|5": {
"acc": 0.3639120545868082,
"acc_stderr": 0.013252539227966199
}
}
```
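If you prefer the raw JSON file over the parquet splits, it can be fetched directly. A minimal sketch using `huggingface_hub` follows (the nesting of the on-disk file is an assumption; some results files wrap the per-task scores under a "results" key):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON linked above from this dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_yanolja__EEVE-Korean-2.8B-v1.0",
    filename="results_2024-02-24T14-38-05.413550.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

per_task = payload.get("results", payload)  # tolerate either layout
print(per_task["all"]["acc"])  # aggregated accuracy, as shown above
```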
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tollefj/skolegpt-no | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 23800483
num_examples: 17799
download_size: 12784581
dataset_size: 23800483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Multimodal-Fatima/VQAv2_minival_validation_sample | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: id
dtype: int64
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_without_filtering
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_B_16_with_openai
sequence: string
- name: blip_caption_beam_5_Salesforce_blip2_flan_t5_xxl
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_all_patches
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: validation
num_bytes: 32906198.0
num_examples: 100
download_size: 8017526
dataset_size: 32906198.0
---
# Dataset Card for "VQAv2_minival_validation_sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ayuhamaro/ner-model-tune | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- zh
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: nlp-model-tune
pretty_name: NER Model Tune
train-eval-index:
- config: default
task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: test
col_mapping:
tokens: tokens
ner_tags: tags
metrics:
- type: seqeval
name: seqeval
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
          '0': O
          '1': B-CARDINAL
          '2': B-DATE
          '3': B-EVENT
          '4': B-FAC
          '5': B-GPE
          '6': B-LANGUAGE
          '7': B-LAW
          '8': B-LOC
          '9': B-MONEY
          '10': B-NORP
          '11': B-ORDINAL
          '12': B-ORG
          '13': B-PERCENT
          '14': B-PERSON
          '15': B-PRODUCT
          '16': B-QUANTITY
          '17': B-TIME
          '18': B-WORK_OF_ART
          '19': I-CARDINAL
          '20': I-DATE
          '21': I-EVENT
          '22': I-FAC
          '23': I-GPE
          '24': I-LANGUAGE
          '25': I-LAW
          '26': I-LOC
          '27': I-MONEY
          '28': I-NORP
          '29': I-ORDINAL
          '30': I-ORG
          '31': I-PERCENT
          '32': I-PERSON
          '33': I-PRODUCT
          '34': I-QUANTITY
          '35': I-TIME
          '36': I-WORK_OF_ART
          '37': E-CARDINAL
          '38': E-DATE
          '39': E-EVENT
          '40': E-FAC
          '41': E-GPE
          '42': E-LANGUAGE
          '43': E-LAW
          '44': E-LOC
          '45': E-MONEY
          '46': E-NORP
          '47': E-ORDINAL
          '48': E-ORG
          '49': E-PERCENT
          '50': E-PERSON
          '51': E-PRODUCT
          '52': E-QUANTITY
          '53': E-TIME
          '54': E-WORK_OF_ART
          '55': S-CARDINAL
          '56': S-DATE
          '57': S-EVENT
          '58': S-FAC
          '59': S-GPE
          '60': S-LANGUAGE
          '61': S-LAW
          '62': S-LOC
          '63': S-MONEY
          '64': S-NORP
          '65': S-ORDINAL
          '66': S-ORG
          '67': S-PERCENT
          '68': S-PERSON
          '69': S-PRODUCT
          '70': S-QUANTITY
          '71': S-TIME
'72': S-WORK_OF_ART
splits:
- name: train
num_bytes: 568
num_examples: 1
download_size: 568
dataset_size: 568
---
# Dataset Card for "NER Model Tune"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** None
- **Repository:** https://huggingface.co/datasets/ayuhamaro/nlp-model-tune
- **Paper:** [More Information Needed]
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions |
ky552/cszs_es_en | ---
dataset_info:
features:
- name: correct_audio
dtype: audio
- name: correct_transcription
dtype: string
- name: correct_file
dtype: string
- name: wrong_audio
dtype: audio
- name: wrong_transcription
dtype: string
- name: wrong_file
dtype: string
splits:
- name: train
num_bytes: 30462716413.44
num_examples: 129220
- name: dev
num_bytes: 3325102230.576
num_examples: 13866
- name: test
num_bytes: 3209609145.2
num_examples: 13740
download_size: 36453700196
dataset_size: 36997427789.215996
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
license: mit
language:
- en
- es
--- |
Santosh-Gupta/EncephalitisAbstracts | ---
license: mit
dataset_info:
features:
- name: pmid
dtype: int64
- name: title
dtype: string
- name: abstract
dtype: string
- name: authors
sequence: string
- name: journal_title
dtype: string
- name: issn
dtype: string
- name: publication_date
dtype: string
- name: doi
dtype: string
- name: keywords
sequence: string
splits:
- name: train
num_bytes: 60135099
num_examples: 47714
download_size: 35417285
dataset_size: 60135099
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v4-math-54ae93-2018366737 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v4
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-350m_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v4
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v4
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-350m_eval
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v4
* Config: mathemakitten--winobias_antistereotype_test_cot_v4
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
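For inspection, the prediction files can be loaded like any other dataset repository. A minimal sketch (the configuration and split layout are assumptions; check the repository's file listing for the exact names):
```python
from datasets import load_dataset

# Load the AutoTrain prediction files from this repository; adjust the
# configuration/split arguments if the repository layout differs.
preds = load_dataset(
    "autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v4-math-54ae93-2018366737"
)
print(preds)
```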
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
i4ds/test_dataset_mp3_vs_wav | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: language
dtype: string
- name: prompt
dtype: string
splits:
- name: mp3
num_bytes: 274300.0
num_examples: 3
- name: wav
num_bytes: 2882995.0
num_examples: 3
download_size: 2892030
dataset_size: 3157295.0
configs:
- config_name: default
data_files:
- split: mp3
path: data/mp3-*
- split: wav
path: data/wav-*
---
|
open-llm-leaderboard/details_jeiku__Soulful_Bepis_7B | ---
pretty_name: Evaluation run of jeiku/Soulful_Bepis_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeiku/Soulful_Bepis_7B](https://huggingface.co/jeiku/Soulful_Bepis_7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeiku__Soulful_Bepis_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T08:34:30.661072](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__Soulful_Bepis_7B/blob/main/results_2024-03-04T08-34-30.661072.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6233219424488493,\n\
\ \"acc_stderr\": 0.03282744237626033,\n \"acc_norm\": 0.6286298883234799,\n\
\ \"acc_norm_stderr\": 0.03349039401734607,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5665172743251056,\n\
\ \"mc2_stderr\": 0.015333488256500133\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.01425856388051378,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038076\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617307309300936,\n\
\ \"acc_stderr\": 0.0048505089451160895,\n \"acc_norm\": 0.8069109739095798,\n\
\ \"acc_norm_stderr\": 0.003939155484500653\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.02489246917246283\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659356,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659356\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201034,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201034\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n\
\ \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n\
\ \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457155,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457155\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937624,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799215,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799215\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559807,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559807\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.0194695182215737,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.0194695182215737\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5665172743251056,\n\
\ \"mc2_stderr\": 0.015333488256500133\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702316\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3904473085670963,\n \
\ \"acc_stderr\": 0.013437829864668575\n }\n}\n```"
repo_url: https://huggingface.co/jeiku/Soulful_Bepis_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|arc:challenge|25_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|gsm8k|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hellaswag|10_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T08-34-30.661072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T08-34-30.661072.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- '**/details_harness|winogrande|5_2024-03-04T08-34-30.661072.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T08-34-30.661072.parquet'
- config_name: results
data_files:
- split: 2024_03_04T08_34_30.661072
path:
- results_2024-03-04T08-34-30.661072.parquet
- split: latest
path:
- results_2024-03-04T08-34-30.661072.parquet
---
# Dataset Card for Evaluation run of jeiku/Soulful_Bepis_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeiku/Soulful_Bepis_7B](https://huggingface.co/jeiku/Soulful_Bepis_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeiku__Soulful_Bepis_7B",
"harness_winogrande_5",
split="train")
```
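Similarly, the aggregated scores shown on the leaderboard can be loaded through the "results" configuration. A minimal sketch, assuming the same `datasets` API as above (the config and split names are taken from this card's `configs` section):
```python
from datasets import load_dataset

# The "results" config aggregates every task score for this model;
# the "latest" split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_jeiku__Soulful_Bepis_7B",
    "results",
    split="latest",
)
```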
## Latest results
These are the [latest results from run 2024-03-04T08:34:30.661072](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__Soulful_Bepis_7B/blob/main/results_2024-03-04T08-34-30.661072.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6233219424488493,
"acc_stderr": 0.03282744237626033,
"acc_norm": 0.6286298883234799,
"acc_norm_stderr": 0.03349039401734607,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5665172743251056,
"mc2_stderr": 0.015333488256500133
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038076
},
"harness|hellaswag|10": {
"acc": 0.617307309300936,
"acc_stderr": 0.0048505089451160895,
"acc_norm": 0.8069109739095798,
"acc_norm_stderr": 0.003939155484500653
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659356,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201034,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201034
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457155,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457155
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937624,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799215,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799215
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559807,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559807
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.0194695182215737,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.0194695182215737
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5665172743251056,
"mc2_stderr": 0.015333488256500133
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702316
},
"harness|gsm8k|5": {
"acc": 0.3904473085670963,
"acc_stderr": 0.013437829864668575
}
}
```
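To inspect a single task at a specific point in time, the same pattern works with a per-task configuration and a timestamped split instead of "latest". A minimal sketch (the config and split names below are copied from the `configs` section of this card):
```python
from datasets import load_dataset

# Per-task details for the single recorded run; the split name is the
# run timestamp listed under `harness_gsm8k_5` in the configs above.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_jeiku__Soulful_Bepis_7B",
    "harness_gsm8k_5",
    split="2024_03_04T08_34_30.661072",
)
print(gsm8k_details)
```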
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b | ---
pretty_name: Evaluation run of TehVenom/Pygmalion_AlpacaLora-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TehVenom/Pygmalion_AlpacaLora-7b](https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T13:29:21.938536](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b/blob/main/results_2023-10-18T13-29-21.938536.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.25817953020134227,\n\
\ \"em_stderr\": 0.004481774083922211,\n \"f1\": 0.3091002516778529,\n\
\ \"f1_stderr\": 0.0044655983468890985,\n \"acc\": 0.367154387965818,\n\
\ \"acc_stderr\": 0.007802106213381273\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.25817953020134227,\n \"em_stderr\": 0.004481774083922211,\n\
\ \"f1\": 0.3091002516778529,\n \"f1_stderr\": 0.0044655983468890985\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \
\ \"acc_stderr\": 0.003015294242890945\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871601\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T13_29_21.938536
path:
- '**/details_harness|drop|3_2023-10-18T13-29-21.938536.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T13-29-21.938536.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T13_29_21.938536
path:
- '**/details_harness|gsm8k|5_2023-10-18T13-29-21.938536.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T13-29-21.938536.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:17:50.932996.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:17:50.932996.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T13_29_21.938536
path:
- '**/details_harness|winogrande|5_2023-10-18T13-29-21.938536.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T13-29-21.938536.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_17_50.932996
path:
- results_2023-07-19T16:17:50.932996.parquet
- split: 2023_10_18T13_29_21.938536
path:
- results_2023-10-18T13-29-21.938536.parquet
- split: latest
path:
- results_2023-10-18T13-29-21.938536.parquet
---
# Dataset Card for Evaluation run of TehVenom/Pygmalion_AlpacaLora-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/Pygmalion_AlpacaLora-7b](https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b",
"harness_winogrande_5",
split="train")
```
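The same call pattern works for the aggregated results; a minimal sketch, with the config and split names taken from the YAML header of this card:
```python
from datasets import load_dataset

# "results" aggregates all runs; "latest" points at the most recent one, and
# each timestamped split (e.g. "2023_07_19T16_17_50.932996") at an earlier run.
results = load_dataset(
    "open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b",
    "results",
    split="latest",
)
print(results[0])
```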
## Latest results
These are the [latest results from run 2023-10-18T13:29:21.938536](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b/blob/main/results_2023-10-18T13-29-21.938536.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.25817953020134227,
"em_stderr": 0.004481774083922211,
"f1": 0.3091002516778529,
"f1_stderr": 0.0044655983468890985,
"acc": 0.367154387965818,
"acc_stderr": 0.007802106213381273
},
"harness|drop|3": {
"em": 0.25817953020134227,
"em_stderr": 0.004481774083922211,
"f1": 0.3091002516778529,
"f1_stderr": 0.0044655983468890985
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.003015294242890945
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871601
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
theojiang/contrastive_conditional_vid_diff_std_1_6_webvid-train | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 13923596850.0
num_examples: 400000
download_size: 12614420310
dataset_size: 13923596850.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_91 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1280336492
num_examples: 249481
download_size: 1308040400
dataset_size: 1280336492
---
# Dataset Card for "chunk_91"
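Given the schema declared in the YAML above (a `logits` vector and a 2-D `mfcc` sequence per example), a minimal loading sketch:
```python
from datasets import load_dataset

# Stream one example rather than downloading the full ~1.3 GB split
ds = load_dataset("distilled-from-one-sec-cv12/chunk_91", split="train", streaming=True)
example = next(iter(ds))
print(len(example["logits"]), len(example["mfcc"]))  # logit count, MFCC frame count
```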
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/florence_nightingale_santa_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of florence_nightingale_santa/ナイチンゲール〔サンタ〕/南丁格尔〔圣诞〕 (Fate/Grand Order)
This is the dataset of florence_nightingale_santa/ナイチンゲール〔サンタ〕/南丁格尔〔圣诞〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `long_hair, pink_hair, red_eyes, breasts, large_breasts, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 725.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/florence_nightingale_santa_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 631.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/florence_nightingale_santa_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1225 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/florence_nightingale_santa_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/florence_nightingale_santa_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
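The non-raw IMG+TXT packages from the table above can be fetched the same way, without waifuc; a minimal sketch using `dataset-1200.zip`:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the IMG+TXT package (shorter side capped at 1200 pixels)
zip_file = hf_hub_download(
    repo_id='CyberHarem/florence_nightingale_santa_fgo',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# extract the archive; the IMG+TXT layout pairs each image with a .txt tag file
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```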
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, belt, black_skirt, long_sleeves, looking_at_viewer, military_uniform, red_jacket, solo, white_gloves, pleated_skirt, white_pantyhose, boots |
| 1 | 10 |  |  |  |  |  | 1girl, belt, black_skirt, long_sleeves, military_uniform, solo, white_gloves, white_pantyhose, looking_at_viewer, pleated_skirt, red_jacket, strap_between_breasts, simple_background, white_background, folded_ponytail, braided_ponytail, cowboy_shot |
| 2 | 11 |  |  |  |  |  | 1girl, military_uniform, solo, white_gloves, belt, holding_gun, looking_at_viewer, black_skirt, pleated_skirt, handgun, strap_between_breasts, long_sleeves, white_pantyhose, red_jacket |
| 3 | 5 |  |  |  |  |  | 1girl, bandage_over_one_eye, belt, black_coat, black_skirt, coat_on_shoulders, handgun, looking_at_viewer, military_uniform, pleated_skirt, red_jacket, solo, white_gloves, long_sleeves, strap_between_breasts, bandages, parted_lips, white_pantyhose |
| 4 | 6 |  |  |  |  |  | 1girl, bandage_over_one_eye, belt, black_coat, black_skirt, coat_on_shoulders, long_sleeves, looking_at_viewer, military_uniform, pleated_skirt, red_jacket, solo, strap_between_breasts, white_gloves, bag, bandages, closed_mouth, pantyhose, fur-trimmed_sleeves, gun, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, black_coat, coat_on_shoulders, long_sleeves, looking_at_viewer, military_uniform, red_jacket, solo, white_gloves, bandage_over_one_eye, belt, black_skirt, bandages, strap_between_breasts |
| 6 | 6 |  |  |  |  |  | 1girl, bandage_over_one_eye, bandages, coat_on_shoulders, long_sleeves, military_uniform, red_jacket, solo, upper_body, white_gloves, black_coat, closed_mouth, looking_at_viewer, jacket_on_shoulders, adjusting_gloves, simple_background, white_background |
| 7 | 17 |  |  |  |  |  | 1girl, military_uniform, solo, white_gloves, looking_at_viewer, upper_body, red_jacket, simple_background, closed_mouth, long_sleeves, white_background, adjusting_gloves |
| 8 | 15 |  |  |  |  |  | 1girl, looking_at_viewer, santa_costume, santa_hat, solo, fur-trimmed_sleeves, white_gloves, christmas, long_sleeves, pantyhose, bow, fur-trimmed_headwear, red_headwear, very_long_hair, skirt, holding_gun, red_jacket |
| 9 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, formal, jacket, suit, white_shirt, black_necktie, black_gloves, collared_shirt, coat_on_shoulders, long_sleeves, pantyhose, skirt, weapon |
| 10 | 22 |  |  |  |  |  | 1girl, green_bikini, layered_bikini, official_alternate_costume, green_gloves, green_thighhighs, revealing_clothes, shrug_(clothing), looking_at_viewer, nurse_cap, purple_bikini, thighhighs_under_boots, short_sleeves, solo, thigh_boots, black_footwear, navel, black_headwear, cleavage, microskirt, purple_belt, miniskirt, side-tie_bikini_bottom, single_braid, thighs, black_skirt, blush, clipboard, garter_straps, between_fingers, elbow_gloves, folded_ponytail, holding_syringe |
| 11 | 5 |  |  |  |  |  | 1girl, folded_ponytail, looking_at_viewer, navel, side-tie_bikini_bottom, solo, yellow_bikini, cleavage, bare_shoulders, blush, collarbone, smile, underboob, closed_mouth, covered_nipples, holding, parted_lips, sun_hat, whistle, white_background, white_headwear |
| 12 | 6 |  |  |  |  |  | 1girl, collarbone, looking_at_viewer, nipples, solo, completely_nude, blush, folded_ponytail, navel, closed_mouth, parted_lips, pussy, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | belt | black_skirt | long_sleeves | looking_at_viewer | military_uniform | red_jacket | solo | white_gloves | pleated_skirt | white_pantyhose | boots | strap_between_breasts | simple_background | white_background | folded_ponytail | braided_ponytail | cowboy_shot | holding_gun | handgun | bandage_over_one_eye | black_coat | coat_on_shoulders | bandages | parted_lips | bag | closed_mouth | pantyhose | fur-trimmed_sleeves | gun | upper_body | jacket_on_shoulders | adjusting_gloves | santa_costume | santa_hat | christmas | bow | fur-trimmed_headwear | red_headwear | very_long_hair | skirt | formal | jacket | suit | white_shirt | black_necktie | black_gloves | collared_shirt | weapon | green_bikini | layered_bikini | official_alternate_costume | green_gloves | green_thighhighs | revealing_clothes | shrug_(clothing) | nurse_cap | purple_bikini | thighhighs_under_boots | short_sleeves | thigh_boots | black_footwear | navel | black_headwear | cleavage | microskirt | purple_belt | miniskirt | side-tie_bikini_bottom | single_braid | thighs | blush | clipboard | garter_straps | between_fingers | elbow_gloves | holding_syringe | yellow_bikini | bare_shoulders | collarbone | smile | underboob | covered_nipples | holding | sun_hat | whistle | white_headwear | nipples | completely_nude | pussy | uncensored |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------------|:---------------|:--------------------|:-------------------|:-------------|:-------|:---------------|:----------------|:------------------|:--------|:------------------------|:--------------------|:-------------------|:------------------|:-------------------|:--------------|:--------------|:----------|:-----------------------|:-------------|:--------------------|:-----------|:--------------|:------|:---------------|:------------|:----------------------|:------|:-------------|:----------------------|:-------------------|:----------------|:------------|:------------|:------|:-----------------------|:---------------|:-----------------|:--------|:---------|:---------|:-------|:--------------|:----------------|:---------------|:-----------------|:---------|:---------------|:-----------------|:-----------------------------|:---------------|:-------------------|:--------------------|:-------------------|:------------|:----------------|:-------------------------|:----------------|:--------------|:-----------------|:--------|:-----------------|:-----------|:-------------|:--------------|:------------|:-------------------------|:---------------|:---------|:--------|:------------|:----------------|:------------------|:---------------|:------------------|:----------------|:-----------------|:-------------|:--------|:------------|:------------------|:----------|:----------|:----------|:-----------------|:----------|:------------------|:--------|:-------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | X | | X | | | | | | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | X | X | X | X | X | | | | | X | X | | | | | | X | X | X | X | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 17 |  |  |  |  |  | X | | | X | X | X | X | X | X | | | | | X | X | | | | | | | | | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 15 |  |  |  |  |  | X | | | X | X | | X | X | X | | | | | | | | | | X | | | | | | | | | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 10 |  |  |  |  |  | X | | | X | X | | | X | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 22 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | | | | X | | | X | | | | | | | X | X | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | |
| 12 | 6 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | X | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | X | | | | | | | | X | X | X | X |
|
kinianlo/MMTS | ---
dataset_info:
- config_name: laion2B-en-words-count
features:
- name: count
dtype: int64
- name: word
dtype: string
splits:
- name: train
num_bytes: 2040588603
num_examples: 91658096
download_size: 1365127988
dataset_size: 2040588603
- config_name: shakespeare_laion2B-en_words
features:
- name: word
dtype: string
- name: word_lemma
dtype: string
- name: tag
dtype: string
- name: count_corpus_tag
dtype: int64
- name: count_corpus
dtype: int64
- name: count_laion2B-en
dtype: int64
- name: is_physical_entity
dtype: bool
- name: concreteness
dtype: float64
- name: concreteness_lemma
dtype: float64
splits:
- name: train
num_bytes: 1244660
num_examples: 18548
download_size: 0
dataset_size: 1244660
- config_name: shakespeare_words
features:
- name: word
dtype: string
- name: count_corpus
dtype: int64
- name: count_laion2B-en
dtype: int64
splits:
- name: train
num_bytes: 309689
num_examples: 11456
download_size: 193309
dataset_size: 309689
configs:
- config_name: laion2B-en-words-count
data_files:
- split: train
path: laion2B-en-words-count/train-*
- config_name: shakespeare_laion2B-en_words
data_files:
- split: train
path: shakespeare_laion2B-en_words/train-*
- config_name: shakespeare_words
data_files:
- split: train
path: shakespeare_words/train-*
---
# Dataset Card for "MMTS"
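Three configs are declared in the YAML above; a minimal sketch for loading one of them:
```python
from datasets import load_dataset

# pick one of the declared configs: laion2B-en-words-count,
# shakespeare_laion2B-en_words, or shakespeare_words
ds = load_dataset("kinianlo/MMTS", "shakespeare_words", split="train")
row = ds[0]
print(row["word"], row["count_corpus"], row["count_laion2B-en"])
```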
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mnoukhov/test | ---
dataset_info:
features:
- name: foo
dtype: int64
splits:
- name: train
num_bytes: 24
num_examples: 3
download_size: 843
dataset_size: 24
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_liminerity__Blur-7b-v1.22 | ---
pretty_name: Evaluation run of liminerity/Blur-7b-v1.22
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Blur-7b-v1.22](https://huggingface.co/liminerity/Blur-7b-v1.22) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-v1.22\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T14:27:00.815176](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.22/blob/main/results_2024-01-18T14-27-00.815176.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5792659642890636,\n\
\ \"acc_stderr\": 0.033736595862772584,\n \"acc_norm\": 0.5837704661411739,\n\
\ \"acc_norm_stderr\": 0.03445011469626218,\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6795713154043607,\n\
\ \"mc2_stderr\": 0.01513714146837095\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578276,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000324\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6389165504879506,\n\
\ \"acc_stderr\": 0.004793330525656209,\n \"acc_norm\": 0.8208524198366859,\n\
\ \"acc_norm_stderr\": 0.003826921299075399\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342654,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342654\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799602,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799602\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7669724770642202,\n \"acc_stderr\": 0.01812566918086149,\n \"\
acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.01812566918086149\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105296,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.02441494730454368,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.02441494730454368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395962,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395962\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.016251139711570762,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.016251139711570762\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336393,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336393\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.02667561192603711,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.02667561192603711\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n\
\ \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n\
\ \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6795713154043607,\n\
\ \"mc2_stderr\": 0.01513714146837095\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722762\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.310841546626232,\n \
\ \"acc_stderr\": 0.012748860507777727\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Blur-7b-v1.22
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-15-07.987352.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-27-00.815176.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- '**/details_harness|winogrande|5_2024-01-18T14-15-07.987352.parquet'
- split: 2024_01_18T14_27_00.815176
path:
- '**/details_harness|winogrande|5_2024-01-18T14-27-00.815176.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T14-27-00.815176.parquet'
- config_name: results
data_files:
- split: 2024_01_18T14_15_07.987352
path:
- results_2024-01-18T14-15-07.987352.parquet
- split: 2024_01_18T14_27_00.815176
path:
- results_2024-01-18T14-27-00.815176.parquet
- split: latest
path:
- results_2024-01-18T14-27-00.815176.parquet
---
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.22
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.22](https://huggingface.co/liminerity/Blur-7b-v1.22) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-v1.22",
"harness_winogrande_5",
split="train")
```
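The same call works for any of the 63 configurations listed in the YAML header above; for example, the aggregated metrics and a single task from a specific timestamped run can be loaded like this (the config and split names below are taken directly from this card's YAML):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_liminerity__Blur-7b-v1.22"

# Aggregated metrics for the whole run (the "results" configuration);
# "latest" resolves to the 2024-01-18T14:27:00 evaluation.
aggregated = load_dataset(repo, "results", split="latest")

# A single task from the earlier run, addressed by its timestamped split.
gsm8k_first_run = load_dataset(
    repo,
    "harness_gsm8k_5",
    split="2024_01_18T14_15_07.987352",
)
```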
## Latest results
These are the [latest results from run 2024-01-18T14:27:00.815176](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.22/blob/main/results_2024-01-18T14-27-00.815176.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.5792659642890636,
"acc_stderr": 0.033736595862772584,
"acc_norm": 0.5837704661411739,
"acc_norm_stderr": 0.03445011469626218,
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6795713154043607,
"mc2_stderr": 0.01513714146837095
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578276,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000324
},
"harness|hellaswag|10": {
"acc": 0.6389165504879506,
"acc_stderr": 0.004793330525656209,
"acc_norm": 0.8208524198366859,
"acc_norm_stderr": 0.003826921299075399
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342654,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342654
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799602,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799602
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.01812566918086149,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.01812566918086149
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105296,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02441494730454368,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02441494730454368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395962,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570762,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570762
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336393,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336393
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.02667561192603711,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.02667561192603711
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41460234680573665,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.41460234680573665,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6795713154043607,
"mc2_stderr": 0.01513714146837095
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722762
},
"harness|gsm8k|5": {
"acc": 0.310841546626232,
"acc_stderr": 0.012748860507777727
}
}
```
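If you want the raw JSON behind the numbers above rather than the parquet exports, the file linked at the top of this section can be fetched directly with `huggingface_hub`. A minimal sketch; the key layout inside the file is an assumption, so inspect the top-level keys before relying on any particular one:
```python
import json
from huggingface_hub import hf_hub_download

# Download the results file for the latest run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_liminerity__Blur-7b-v1.22",
    filename="results_2024-01-18T14-27-00.815176.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Check the structure first; the per-task metrics shown above may sit
# under a nested key rather than at the top level.
print(sorted(raw.keys()))
```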
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-from-one-sec-cv12/chunk_138 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1136702076
num_examples: 221493
download_size: 1162374323
dataset_size: 1136702076
---
# Dataset Card for "chunk_138"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marcones/pedrohenriqueespen | ---
license: openrail
---
|
growth-cadet/eval_gpt4_jobpost_treshold80 | ---
dataset_info:
features:
- name: id
dtype: string
- name: ats
dtype: string
- name: context
dtype: string
- name: context_token_count
dtype: int64
- name: gpt-4_response
dtype: string
- name: gpt-4_cost
dtype: float64
- name: gpt-4_sys5_response
dtype: string
- name: gpt-4_sys5_cost
dtype: float64
- name: sys5_obj
struct:
- name: focus_areas
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: industries
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: products_and_technologies
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: mistral01_gen
dtype: string
- name: eval_crit
struct:
- name: focus_areas
dtype: float64
- name: industries
dtype: float64
- name: products_and_technologies
dtype: float64
- name: eval_values
struct:
- name: focus_areas
sequence: int64
- name: industries
sequence: int64
- name: products_and_technologies
sequence: int64
splits:
- name: train
num_bytes: 11546785.45346062
num_examples: 1320
- name: test
num_bytes: 7830305.560354374
num_examples: 746
download_size: 9571648
dataset_size: 19377091.013814993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
NBayer/papers_text_2_summary | ---
license: openrail
---
|
jacobbieker/era5-6hour-val | ---
license: mit
---
|
HiTZ/EusTrivia | ---
task_categories:
- question-answering
language:
- eu
pretty_name: EusTrivia
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: test
path: "triviaeus.jsonl"
---
# Dataset Card for EusTrivia
EusTrivia consists of 1,715 trivia questions from multiple online sources. 56.3% of the questions are elementary level (grades 3-6), while the rest are considered challenging. A significant portion of the questions focus specifically on the Basque Country, its language and culture. Each multiple-choice question contains two, three or four choices (3.84 on average) and a single correct answer. Five areas of knowledge are covered:
- **Humanities and Natural Sciences** (27.8%): This category encompasses questions about history, geography, biology, ecology and other social and natural sciences.
- **Leisure and Art** (24.5%): This category includes questions on sports and athletes, performative and plastic arts and artists, architecture, cultural events, and related topics.
- **Music** (16.0%): This category groups all questions about music and musicians, both classical and contemporary.
- **Language and Literature** (17.1%): This category is concerned with all kinds of literary works and writers, as well as metalinguistic questions (e.g., definitions, synonyms, and word usage).
- **Mathematics and ICT** (14.5%): This category covers mathematical problems and questions about ICT, as well as questions about people known for their contributions to these fields of knowledge.
- **Curated by:** HiTZ Research Center & IXA Research group (University of the Basque Country UPV/EHU)
- **Language(s) (NLP):** Basque (eu)
- 📒 Blog Post: [Latxa: An Open Language Model and Evaluation Suite for Basque](https://www.hitz.eus/en/node/340)
- 📖 Paper: [Latxa: An Open Language Model and Evaluation Suite for Basque](https://arxiv.org/abs/2403.20266)
- 💻 Code: [hitz-zentroa/latxa](https://github.com/hitz-zentroa/latxa)
- 📧 Contact: [hitz@ehu.eus](mailto:hitz@ehu.eus)
## Example
Basque Example:
```txt
Galdera: Zenbat kilo dauka tona batek?
A. 10.000 kilo
B. 1.000.000 kilo
C. 1.000 kilo
D. 100 kilo
Erantzuna: C
```
English Translation:
```txt
Question: How many kilograms are there in a tonne?
A. 10,000 kilos
B. 1,000,000 kilos
C. 1,000 kilos
D. 100 kilos
Answer: C
```
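To load the benchmark programmatically, the `datasets` library can be used. A minimal sketch follows; the card does not document the field names, so the example only inspects the first record rather than assuming a schema.
```python
from datasets import load_dataset

# EusTrivia ships a single "test" split (see the configs above).
eustrivia = load_dataset("HiTZ/EusTrivia", split="test")

print(len(eustrivia))  # expected: 1,715 questions
print(eustrivia[0])    # inspect the raw fields of one question
```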
## Citation
```bibtex
@misc{etxaniz2024latxa,
title={{L}atxa: An Open Language Model and Evaluation Suite for {B}asque},
author={Julen Etxaniz and Oscar Sainz and Naiara Perez and Itziar Aldabe and German Rigau and Eneko Agirre and Aitor Ormazabal and Mikel Artetxe and Aitor Soroa},
year={2024},
eprint={2403.20266},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
bskang/CVPR2023_title_abstract_intro_300 | ---
dataset_info:
features:
- name: title
dtype: string
- name: abstract
dtype: string
- name: introduction
dtype: string
splits:
- name: train
num_bytes: 2413669
num_examples: 300
download_size: 1286131
dataset_size: 2413669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CVPR2023_title_abstract_intro_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roa7n/patched_test_p_40_f_SPOUT_v4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 478745882
num_examples: 1470999
download_size: 0
dataset_size: 478745882
---
# Dataset Card for "patched_test_p_40_f_SPOUT_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Predict9731/voxpopuli_windows_cs | ---
dataset_info:
features:
- name: audio_id
dtype: string
- name: language
dtype:
class_label:
names:
'0': en
'1': de
'2': fr
'3': es
'4': pl
'5': it
'6': ro
'7': hu
'8': cs
'9': nl
'10': fi
'11': hr
'12': sk
'13': sl
'14': et
'15': lt
'16': en_accented
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: raw_text
dtype: string
- name: normalized_text
dtype: string
- name: gender
dtype: string
- name: speaker_id
dtype: string
- name: is_gold_transcript
dtype: bool
- name: accent
dtype: string
splits:
- name: train
num_bytes: 6549063392.628
num_examples: 18902
download_size: 10449462424
dataset_size: 6549063392.628
---
# Dataset Card for "voxpopuli_windows_cs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_present_for_neutral_future | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3458
num_examples: 40
- name: test
num_bytes: 3439
num_examples: 40
- name: train
num_bytes: 25860
num_examples: 339
download_size: 20082
dataset_size: 32757
---
# Dataset Card for "MULTI_VALUE_cola_present_for_neutral_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_pythia-160m_53 | ---
dataset_info:
features:
- name: id
dtype: string
- name: subreddit
dtype: string
- name: title
dtype: string
- name: post
dtype: string
- name: summary
dtype: string
- name: query_token
sequence: int64
- name: query
dtype: string
- name: reference_response
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
- name: query_reference_response
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
splits:
- name: train
num_bytes: 1600440249
num_examples: 116722
- name: validation
num_bytes: 88425771
num_examples: 6447
- name: test
num_bytes: 89922466
num_examples: 6553
download_size: 551824801
dataset_size: 1778788486
---
# TL;DR SFT Dataset for OpenAI's [Summarize from Feedback](https://openai.com/blog/summarization/) task
The dataset is directly taken from https://github.com/openai/summarize-from-feedback/tree/700967448d10004279f138666442bf1497d0e705#reddit-tldr-dataset
These columns are taken directly from the aforementioned dataset:
* **id**: unique identifier for the post
* **subreddit**: subreddit the post was taken from
* **title**: title of the post
* **post**: body of the post
* **summary**: summary of the post
* **reference_response**: reference response for the post
These columns are added by this preprocessing script:
* **query**: length-limited query for summarization: OAI pre-processes the main text (title + subreddit + post), ensuring it has only 512 tokens; if the main text is too long, it tries to truncate at the last `\n`. If it's too short, it pads the main text ([summarize_from_feedback/tasks.py#L98-L165](https://github.com/openai/summarize-from-feedback/blob/700967448d10004279f138666442bf1497d0e705/summarize_from_feedback/tasks.py#L98-L165)). Padding is either the space or `[PAD]` token (see Args below and the sketch that follows them).
* **query_token**: tokenized version of `query`
* **reference_response_token**: tokenized version of `reference_response`
* **reference_response_token_len**: length of `reference_response_token`
* **query_reference_response**: concatenation of `query.strip()` and `reference_response`
* **query_reference_response_token**: tokenized version of `query_reference_response`, up to `max_sft_query_response_length` tokens
* **query_reference_response_token_len**: length of `query_reference_response_token`
# Args
```python
{'base_model': 'EleutherAI/pythia-160m',
'hf_entity': 'vwxyzjn',
'max_rm_query_response_length': 638,
'max_rm_response_length': 169,
'max_sft_query_response_length': 562,
'max_sft_response_length': 53}
{'format_str': 'SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
'length': 512,
'pad_side': 'left',
'padding': [209],
'truncate_field': 'post',
'truncate_text': '\n'}
```
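For illustration, the pre-processing described above can be approximated in a few lines. The sketch below is a simplified re-implementation under the Args shown; `build_query` is a hypothetical helper (not part of this dataset's code), and it truncates the `post` field at the character level for brevity, whereas the original operates on tokens.
```python
from transformers import AutoTokenizer

# Simplified sketch of the query construction (see the tasks.py link above).
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-160m")

FORMAT_STR = "SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {post}\n\nTL;DR:"
LENGTH = 512        # target query length in tokens
PAD_TOKEN_ID = 209  # left-padding token from the Args above

def build_query(subreddit: str, title: str, post: str) -> list:
    """Format the query, shorten `post` while the result is over LENGTH tokens,
    then left-pad the token sequence to exactly LENGTH tokens."""
    while True:
        query = FORMAT_STR.format(subreddit=subreddit, title=title, post=post)
        tokens = tokenizer.encode(query)
        if len(tokens) <= LENGTH:
            break
        # Too long: drop the post's tail up to its last newline (or halve it).
        post = post[: post.rfind("\n")] if "\n" in post else post[: len(post) // 2]
    return [PAD_TOKEN_ID] * (LENGTH - len(tokens)) + tokens
```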
|
OKR/8888 | ---
license: apache-2.0
---
|
ibranze/araproje_arc_tr_s5 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 46973
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_tr_s5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_degree_adj_for_adv | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 4837
num_examples: 12
- name: train
num_bytes: 5343
num_examples: 12
download_size: 16693
dataset_size: 10180
---
# Dataset Card for "MULTI_VALUE_rte_degree_adj_for_adv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_97 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 65415362
num_examples: 6980
download_size: 17640548
dataset_size: 65415362
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_97"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
scukdde-llm/alpaca-law | ---
license: apache-2.0
---
|
zolak/twitter_dataset_50_1713214770 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1427201
num_examples: 3487
download_size: 718158
dataset_size: 1427201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
showchen/Amiya | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_sst2_a_participle | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 34412
num_examples: 227
- name: test
num_bytes: 79710
num_examples: 508
- name: train
num_bytes: 1263901
num_examples: 11200
download_size: 809616
dataset_size: 1378023
---
# Dataset Card for "MULTI_VALUE_sst2_a_participle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arihant-neohumans/omegle_data | ---
license: mit
---
|
JeunesseAfricaine/my_tweets | ---
license: apache-2.0
---
|
mber/subset_squadv2_format_date | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 2646337.9600498625
num_examples: 2847
- name: validation
num_bytes: 4913.35830947971
num_examples: 5
download_size: 4265888
dataset_size: 2651251.3183593424
---
# Dataset Card for "subset_squadv2_format_date"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/results | ---
language:
- en
---

# Open LLM Leaderboard Results
This repository contains the evaluation outcomes of models submitted to the Open LLM Leaderboard. Our goal is to shed light on cutting-edge Large Language Models (LLMs) and chatbots, enabling you to make well-informed decisions for your chosen application.
## Evaluation Methodology
The evaluation process involves running your models against several benchmarks from the Eleuther AI Harness, a unified framework for measuring the effectiveness of generative language models. Below is a brief overview of each benchmark:
1. AI2 Reasoning Challenge (ARC) - Grade-School Science Questions (25-shot)
2. HellaSwag - Commonsense Inference (10-shot)
3. MMLU - Massive Multi-Task Language Understanding, knowledge on 57 domains (5-shot)
4. TruthfulQA - Propensity to Produce Falsehoods (0-shot)
5. Winogrande - Adversarial Winograd Schema Challenge (5-shot)
6. GSM8k - Grade School Math Word Problems, testing multi-step mathematical reasoning (5-shot)
Together, these benchmarks provide an assessment of a model's capabilities in terms of knowledge, reasoning, and some math, in various scenarios.
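The raw per-model result files stored here can also be browsed programmatically. Below is a minimal sketch using `huggingface_hub`; the exact directory layout of this repository is an assumption and may change over time.
```python
from huggingface_hub import hf_hub_download, list_repo_files

# List the JSON result files stored in this dataset repository.
files = list_repo_files("open-llm-leaderboard/results", repo_type="dataset")
json_results = [f for f in files if f.endswith(".json")]
print(json_results[:5])

# Download one result file locally for inspection.
local_path = hf_hub_download(
    "open-llm-leaderboard/results", json_results[0], repo_type="dataset"
)
print(local_path)
```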
## Exploring Model Details
For further insights into the inputs and outputs of specific models, locate the "📄" emoji associated with the desired model in the leaderboard. Clicking on this icon will direct you to the respective GitHub page containing detailed information about the model's behavior during the evaluation process.
|
autoevaluate/autoeval-staging-eval-project-4144bd7b-94bf-4e9e-87a5-f722d28cd7cd-4745 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
sana280/dataset_2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: eval
num_bytes: 265963
num_examples: 50
download_size: 143352
dataset_size: 265963
configs:
- config_name: default
data_files:
- split: eval
path: data/eval-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_existential_you_have | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 334749
num_examples: 1855
- name: test
num_bytes: 3278678
num_examples: 18476
- name: train
num_bytes: 2894888
num_examples: 16080
download_size: 3970712
dataset_size: 6508315
---
# Dataset Card for "MULTI_VALUE_qqp_existential_you_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__opt1.3b_10e5 | ---
pretty_name: Evaluation run of BFauber/opt1.3b_10e5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/opt1.3b_10e5](https://huggingface.co/BFauber/opt1.3b_10e5) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt1.3b_10e5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T19:22:53.886717](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e5/blob/main/results_2024-02-02T19-22-53.886717.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25956926248319,\n\
\ \"acc_stderr\": 0.030774098043396373,\n \"acc_norm\": 0.26133458633537765,\n\
\ \"acc_norm_stderr\": 0.031590043549587034,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602578,\n \"mc2\": 0.38176130958688614,\n\
\ \"mc2_stderr\": 0.014298868021565533\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2645051194539249,\n \"acc_stderr\": 0.012889272949313368,\n\
\ \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382318\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4070902210714997,\n\
\ \"acc_stderr\": 0.004902878806733043,\n \"acc_norm\": 0.5280820553674567,\n\
\ \"acc_norm_stderr\": 0.004981905293878152\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073462,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073462\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387536,\n\
\ \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387536\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707841,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707841\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n\
\ \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.26129032258064516,\n\
\ \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.24242424242424243,\n \"acc_stderr\": 0.030532892233932032,\n \"\
acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.030532892233932032\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35128205128205126,\n \"acc_stderr\": 0.024203665177902796,\n\
\ \"acc_norm\": 0.35128205128205126,\n \"acc_norm_stderr\": 0.024203665177902796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869666,\n \"\
acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869666\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602157,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602157\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13004484304932734,\n\
\ \"acc_stderr\": 0.02257451942417487,\n \"acc_norm\": 0.13004484304932734,\n\
\ \"acc_norm_stderr\": 0.02257451942417487\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462471,\n\
\ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462471\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.034859460964757415,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.034859460964757415\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.028286324075564397,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.028286324075564397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455779,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455779\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2006172839506173,\n \"acc_stderr\": 0.022282313949774885,\n\
\ \"acc_norm\": 0.2006172839506173,\n \"acc_norm_stderr\": 0.022282313949774885\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902016,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902016\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.15060240963855423,\n\
\ \"acc_stderr\": 0.02784386378726433,\n \"acc_norm\": 0.15060240963855423,\n\
\ \"acc_norm_stderr\": 0.02784386378726433\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602578,\n \"mc2\": 0.38176130958688614,\n\
\ \"mc2_stderr\": 0.014298868021565533\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5666929755327546,\n \"acc_stderr\": 0.013926915052757347\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/opt1.3b_10e5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-22-53.886717.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- '**/details_harness|winogrande|5_2024-02-02T19-22-53.886717.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T19-22-53.886717.parquet'
- config_name: results
data_files:
- split: 2024_02_02T19_22_53.886717
path:
- results_2024-02-02T19-22-53.886717.parquet
- split: latest
path:
- results_2024-02-02T19-22-53.886717.parquet
---
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e5](https://huggingface.co/BFauber/opt1.3b_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e5",
"harness_winogrande_5",
split="train")
```
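To load the aggregated metrics instead, the "results" configuration can be queried the same way; a minimal sketch based on the configs listed in the header above:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# its "latest" split points to the most recent results parquet file.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e5",
	"results",
	split="latest")
```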
## Latest results
These are the [latest results from run 2024-02-02T19:22:53.886717](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e5/blob/main/results_2024-02-02T19-22-53.886717.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25956926248319,
"acc_stderr": 0.030774098043396373,
"acc_norm": 0.26133458633537765,
"acc_norm_stderr": 0.031590043549587034,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602578,
"mc2": 0.38176130958688614,
"mc2_stderr": 0.014298868021565533
},
"harness|arc:challenge|25": {
"acc": 0.2645051194539249,
"acc_stderr": 0.012889272949313368,
"acc_norm": 0.295221843003413,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.4070902210714997,
"acc_stderr": 0.004902878806733043,
"acc_norm": 0.5280820553674567,
"acc_norm_stderr": 0.004981905293878152
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073462,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073462
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707841,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707841
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.030532892233932032,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.030532892233932032
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602157,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602157
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13004484304932734,
"acc_stderr": 0.02257451942417487,
"acc_norm": 0.13004484304932734,
"acc_norm_stderr": 0.02257451942417487
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462471,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.034859460964757415,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.034859460964757415
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564397,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455779,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455779
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2006172839506173,
"acc_stderr": 0.022282313949774885,
"acc_norm": 0.2006172839506173,
"acc_norm_stderr": 0.022282313949774885
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902016,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902016
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.15060240963855423,
"acc_stderr": 0.02784386378726433,
"acc_norm": 0.15060240963855423,
"acc_norm_stderr": 0.02784386378726433
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602578,
"mc2": 0.38176130958688614,
"mc2_stderr": 0.014298868021565533
},
"harness|winogrande|5": {
"acc": 0.5666929755327546,
"acc_stderr": 0.013926915052757347
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
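The same numbers live in the raw results file linked above; a minimal sketch of fetching it directly with `huggingface_hub` (the file name comes from the "results" config in the header; the exact JSON layout is an assumption, so the sketch falls back gracefully if the metrics are not nested):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file named in the "results" configuration.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BFauber__opt1.3b_10e5",
    filename="results_2024-02-02T19-22-53.886717.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The metrics shown above may sit under a top-level "results" key,
# depending on the file layout; fall back to the document root otherwise.
metrics = data.get("results", data)
print(metrics["all"])
```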
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ShoukanLabs/OpenNiji-345001_380000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: url
dtype: string
- name: prompt
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 62648261804.144
num_examples: 34999
download_size: 55791960851
dataset_size: 62648261804.144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OpenNiji-345001_380000"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ndavidson__Phi-2-openassistant | ---
pretty_name: Evaluation run of ndavidson/Phi-2-openassistant
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ndavidson/Phi-2-openassistant](https://huggingface.co/ndavidson/Phi-2-openassistant)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ndavidson__Phi-2-openassistant\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T19:54:23.691350](https://huggingface.co/datasets/open-llm-leaderboard/details_ndavidson__Phi-2-openassistant/blob/main/results_2024-04-15T19-54-23.691350.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5611014530980067,\n\
\ \"acc_stderr\": 0.03394011520806338,\n \"acc_norm\": 0.5621433515751842,\n\
\ \"acc_norm_stderr\": 0.03463883741520245,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4795572227845841,\n\
\ \"mc2_stderr\": 0.014876041209400705\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064664,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5587532364070902,\n\
\ \"acc_stderr\": 0.00495521278783238,\n \"acc_norm\": 0.7448715395339573,\n\
\ \"acc_norm_stderr\": 0.004350424750646201\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562427,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562427\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.042407993275749255,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.042407993275749255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481912,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481912\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.032087795587867514,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.032087795587867514\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.031618779179354115,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.031618779179354115\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443138,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443138\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6883780332056194,\n\
\ \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.6883780332056194,\n\
\ \"acc_norm_stderr\": 0.016562433867284176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.02712511551316686,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.02712511551316686\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n\
\ \"acc_stderr\": 0.012552598958563662,\n \"acc_norm\": 0.40808344198174706,\n\
\ \"acc_norm_stderr\": 0.012552598958563662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590888,\n \
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590888\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872475,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4795572227845841,\n\
\ \"mc2_stderr\": 0.014876041209400705\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865355\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5322213798332069,\n \
\ \"acc_stderr\": 0.01374385730307378\n }\n}\n```"
repo_url: https://huggingface.co/ndavidson/Phi-2-openassistant
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|arc:challenge|25_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|gsm8k|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hellaswag|10_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-54-23.691350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T19-54-23.691350.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- '**/details_harness|winogrande|5_2024-04-15T19-54-23.691350.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T19-54-23.691350.parquet'
- config_name: results
data_files:
- split: 2024_04_15T19_54_23.691350
path:
- results_2024-04-15T19-54-23.691350.parquet
- split: latest
path:
- results_2024-04-15T19-54-23.691350.parquet
---
# Dataset Card for Evaluation run of ndavidson/Phi-2-openassistant
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ndavidson/Phi-2-openassistant](https://huggingface.co/ndavidson/Phi-2-openassistant) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ndavidson__Phi-2-openassistant",
"harness_winogrande_5",
split="train")
```
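The aggregated scores can be loaded the same way through the "results" configuration declared in the YAML header above. A minimal sketch, assuming a `datasets` version that resolves those YAML configs; the shape of the returned row is an assumption:
```python
from datasets import load_dataset

# "results" and "latest" are the config/split names declared in the YAML
# header above; the "latest" split points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_ndavidson__Phi-2-openassistant",
                       "results",
                       split="latest")

# Assumption: each row holds the aggregated metrics for one run.
print(results[0])
```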
## Latest results
These are the [latest results from run 2024-04-15T19:54:23.691350](https://huggingface.co/datasets/open-llm-leaderboard/details_ndavidson__Phi-2-openassistant/blob/main/results_2024-04-15T19-54-23.691350.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5611014530980067,
"acc_stderr": 0.03394011520806338,
"acc_norm": 0.5621433515751842,
"acc_norm_stderr": 0.03463883741520245,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4795572227845841,
"mc2_stderr": 0.014876041209400705
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064664,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268452
},
"harness|hellaswag|10": {
"acc": 0.5587532364070902,
"acc_stderr": 0.00495521278783238,
"acc_norm": 0.7448715395339573,
"acc_norm_stderr": 0.004350424750646201
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.042407993275749255,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.042407993275749255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481912,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481912
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.032087795587867514,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.032087795587867514
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443138,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6883780332056194,
"acc_stderr": 0.016562433867284176,
"acc_norm": 0.6883780332056194,
"acc_norm_stderr": 0.016562433867284176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.02712511551316686,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.02712511551316686
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.012552598958563662,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.012552598958563662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872475,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4795572227845841,
"mc2_stderr": 0.014876041209400705
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.012134386019865355
},
"harness|gsm8k|5": {
"acc": 0.5322213798332069,
"acc_stderr": 0.01374385730307378
}
}
```
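Since every per-task entry in the dict above follows the same `harness|hendrycksTest-<subject>|5` key pattern, the MMLU average can be recomputed directly from it. A minimal sketch; the helper name and the two-entry sample dict are illustrative, only the key prefix and the `"acc"` field come from the JSON shown:
```python
def mmlu_mean_acc(results: dict) -> float:
    """Unweighted mean accuracy over the MMLU (hendrycksTest) subtasks.

    `results` is a dict shaped like the one above, with keys such as
    "harness|hendrycksTest-abstract_algebra|5" mapping to metric dicts.
    """
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)


# Illustrative two-entry sample taken from the values above:
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889},
}
print(mmlu_mean_acc(sample))  # 0.3794...
```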
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Bluebomber182/AI-Emotions | ---
license: unknown
---
|
averoo/lurk | ---
license: mit
dataset_info:
features:
- name: header
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 91683615
num_examples: 5671
download_size: 48923745
dataset_size: 91683615
---
|
sunhaozhepy/sst_rake_keywords | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: float32
- name: tokens
dtype: string
- name: tree
dtype: string
- name: keywords
dtype: string
splits:
- name: train
num_bytes: 3239160
num_examples: 8544
- name: validation
num_bytes: 420858
num_examples: 1101
- name: test
num_bytes: 838797
num_examples: 2210
download_size: 2784205
dataset_size: 4498815
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_bunnycore__Mnemosyne-7B | ---
pretty_name: Evaluation run of bunnycore/Mnemosyne-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bunnycore/Mnemosyne-7B](https://huggingface.co/bunnycore/Mnemosyne-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bunnycore__Mnemosyne-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T22:25:39.468418](https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Mnemosyne-7B/blob/main/results_2024-04-08T22-25-39.468418.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the \"results\" configuration\
\ and in the \"latest\" split of each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.6261050202251396,\n\
\ \"acc_stderr\": 0.0327243359383022,\n \"acc_norm\": 0.6291470943507816,\n\
\ \"acc_norm_stderr\": 0.03338065256024651,\n \"mc1\": 0.4638922888616891,\n\
\ \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6267167564797568,\n\
\ \"mc2_stderr\": 0.015372099067347617\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n\
\ \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.01389693846114568\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6516630153355906,\n\
\ \"acc_stderr\": 0.004754697013354958,\n \"acc_norm\": 0.8472415853415655,\n\
\ \"acc_norm_stderr\": 0.0035901923719696563\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n\
\ \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n\
\ \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790492,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790492\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.01396439376989913,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.01396439376989913\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.0162328268186785,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.0162328268186785\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504517,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4638922888616891,\n\
\ \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6267167564797568,\n\
\ \"mc2_stderr\": 0.015372099067347617\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710686\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5208491281273692,\n \
\ \"acc_stderr\": 0.013760506094029868\n }\n}\n```"
repo_url: https://huggingface.co/bunnycore/Mnemosyne-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-25-39.468418.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-25-39.468418.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- '**/details_harness|winogrande|5_2024-04-08T22-25-39.468418.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T22-25-39.468418.parquet'
- config_name: results
data_files:
- split: 2024_04_08T22_25_39.468418
path:
- results_2024-04-08T22-25-39.468418.parquet
- split: latest
path:
- results_2024-04-08T22-25-39.468418.parquet
---
# Dataset Card for Evaluation run of bunnycore/Mnemosyne-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bunnycore/Mnemosyne-7B](https://huggingface.co/bunnycore/Mnemosyne-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bunnycore__Mnemosyne-7B",
"harness_winogrande_5",
	split="latest")
```
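The aggregated metrics live in the dedicated "results" configuration. A minimal sketch of loading them, reusing the config and split names declared in this card's metadata:
```python
from datasets import load_dataset

# "results" is the aggregated config and "latest" the rolling split,
# both taken from the configs listed in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_bunnycore__Mnemosyne-7B",
    "results",
    split="latest",
)
```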
## Latest results
These are the [latest results from run 2024-04-08T22:25:39.468418](https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Mnemosyne-7B/blob/main/results_2024-04-08T22-25-39.468418.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6261050202251396,
"acc_stderr": 0.0327243359383022,
"acc_norm": 0.6291470943507816,
"acc_norm_stderr": 0.03338065256024651,
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6267167564797568,
"mc2_stderr": 0.015372099067347617
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.01423587248790987,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.01389693846114568
},
"harness|hellaswag|10": {
"acc": 0.6516630153355906,
"acc_stderr": 0.004754697013354958,
"acc_norm": 0.8472415853415655,
"acc_norm_stderr": 0.0035901923719696563
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790492,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790492
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.02486499515976775,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.02486499515976775
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266857,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990925,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.01396439376989913,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.01396439376989913
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.0162328268186785,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.0162328268186785
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504517,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333335,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333335
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6267167564797568,
"mc2_stderr": 0.015372099067347617
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710686
},
"harness|gsm8k|5": {
"acc": 0.5208491281273692,
"acc_stderr": 0.013760506094029868
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cheshietext/datascope1 | ---
license: zlib
---
|
Thanmay/commonsense_qa-gu | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: question_concept
dtype: string
- name: choices
sequence:
- name: label
dtype: string
- name: text
dtype: string
- name: answerKey
dtype: string
- name: itv2 gu question
dtype: string
splits:
- name: validation
num_bytes: 493203
num_examples: 1221
- name: test
num_bytes: 468965
num_examples: 1140
download_size: 492913
dataset_size: 962168
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
barkermrl/mnist-c | ---
license: apache-2.0
---
Source: [https://github.com/google-research/mnist-c](https://github.com/google-research/mnist-c)
# MNIST-C
This repository contains the source code used to create the MNIST-C dataset, a
corrupted MNIST benchmark for testing out-of-distribution robustness of computer
vision models.
Please see our full paper [https://arxiv.org/abs/1906.02337](https://arxiv.org/abs/1906.02337) for more details.
## Dataset
The static dataset is available for download at [https://zenodo.org/record/3239543](https://zenodo.org/record/3239543).
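A minimal loading sketch, assuming the per-corruption folder layout (`mnist_c/<corruption>/{train,test}_{images,labels}.npy`) used by the source repository; `zigzag` below is one of the published corruption names:
```python
import numpy as np

# Paths assume the zenodo archive has been downloaded and extracted locally.
# Images are uint8 arrays of shape (N, 28, 28, 1); labels have shape (N,).
test_images = np.load("mnist_c/zigzag/test_images.npy")
test_labels = np.load("mnist_c/zigzag/test_labels.npy")
print(test_images.shape, test_labels.shape)
```
The same pattern applies to any other corruption folder in the archive. |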
lapp0/query_expansion | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 235625
num_examples: 1421
- name: eval
num_bytes: 11761
num_examples: 74
download_size: 154018
dataset_size: 247386
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---
|
TrainingDataPro/biometric-attacks-in-different-lighting-conditions | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
language:
- en
tags:
- code
- legal
- finance
---
# Biometric Attacks in Different Lighting Conditions Dataset
The dataset consists of videos of individuals and of attacks using photos shown on a monitor. Videos are filmed in different lighting conditions (*in a dark room, daylight, a light room, and nightlight*) and in different places (*indoors and outdoors*). Each video in the dataset has an approximate duration of 20 seconds.
### Types of videos in the dataset:
- **darkroom_photo** - photo of a person in a **dark room**, shown on a computer and filmed on a phone
- **daylight_photo** - photo of a person in **daylight**, shown on a computer and filmed on a phone
- **lightroom_photo** - photo of a person in a **light room**, shown on a computer and filmed on a phone
- **nightlight_photo** - photo of a person in **night light**, shown on a computer and filmed on a phone
- **darkroom_video** - filmed in a **dark room**, in which a person moves his/her head left, right, up, and down
- **daylight_video** - filmed in **daylight**, in which a person moves his/her head left, right, up, and down
- **lightroom_video** - filmed in a **light room**, in which a person moves his/her head left, right, up, and down
- **nightlight_video** - filmed in **night light**, in which a person moves his/her head left, right, up, and down
- **mask** - video of the person wearing a **printed 2D mask**
- **outline** - video of the person wearing a **printed 2D mask with cut-out holes for the eyes**
- **monitor_video** - video of a person played on a computer and filmed on a phone
The dataset serves as a valuable resource for computer vision, anti-spoofing tasks, video analysis, and security systems. It enables the development of algorithms and models that can effectively detect presentation attacks.
Studying the dataset may lead to the development of improved security systems, surveillance technologies, and solutions to mitigate the risks associated with masked individuals carrying out attacks.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=biometric-attacks-in-different-lighting-conditions) to discuss your requirements, learn about the price and buy the dataset.
# Content
- **files** - contains the original videos and the attack videos,
- **dataset_info.csv** - includes information about the videos in the dataset
### File with the extension .csv
- **file**: link to the video,
- **type**: type of the video
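A minimal sketch for inspecting the metadata file, assuming a standard comma-separated layout with the two columns listed above:
```python
import pandas as pd

# Column names ("file", "type") follow the field list above.
info = pd.read_csv("dataset_info.csv")
print(info["type"].value_counts())  # e.g. darkroom_photo, mask, monitor_video, ...
```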
# Attacks might be collected in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=biometric-attacks-in-different-lighting-conditions) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
HaltiaAI/Her-The-Movie-Samantha-and-Theodore-Dataset | ---
license: other
tags:
- Movie Dialog
- Her The Movie
- Dialogs from the Her Movie (2013)
--- |
mango19918/aimodels | ---
license: openrail
---
All of my models posted on AI HUB |
wyluilipe/ru-paraphrases | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: original
dtype: string
- name: paraphrase
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 687174801.2
num_examples: 1881000
- name: test
num_bytes: 36167094.8
num_examples: 99000
download_size: 421032139
dataset_size: 723341896.0
---
# Dataset Card for "ru-paraphrases"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ckail/AoJiao-YaLiZi | ---
license: mit
---
|
nateraw/fuego-20230205-205350-24e650 | ---
tags:
- fuego
fuego:
id: 20230205-205350-24e650
status: running
script: run_glue.py
requirements_file: requirements.txt
space_id: nateraw/fuego-20230205-205350-24e650
space_hardware: cpu-basic
github_repo_id: huggingface/transformers
github_repo_branch: main
github_repo_sha: 59d5edef34ae0fa56065a2e863736d4f133c558b
---
|
DucHaiten/Crazy-Town | ---
license: openrail
---
|
naorm/all-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: hf-git-base
dtype: string
- name: hf-git-large
dtype: string
- name: hf-blip-base
dtype: string
- name: hf-blip-large
dtype: string
- name: lavis-blip-base
dtype: string
- name: lavis-blip-large
dtype: string
- name: hf-git-base-coco
dtype: string
- name: hf-git-large-coco
dtype: string
splits:
- name: train
num_bytes: 813403802.0
num_examples: 5000
download_size: 814281470
dataset_size: 813403802.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
crylake/fill50k_vi | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
- name: vi_text
dtype: string
splits:
- name: train
num_bytes: 456972354.0
num_examples: 50000
download_size: 326272883
dataset_size: 456972354.0
---
# Dataset Card for "fill50k_vi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kaiio1/elliot | ---
license: openrail
---
|
r76941156/MPRINT2 | ---
license: apache-2.0
---
|
FaalSa/cluster0_1 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 78904
num_examples: 2
- name: validation
num_bytes: 79864
num_examples: 2
- name: test
num_bytes: 80824
num_examples: 2
download_size: 31759
dataset_size: 239592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
growth-cadet/eval02_gpt4_jobpost | ---
dataset_info:
features:
- name: id
dtype: string
- name: ats
dtype: string
- name: context
dtype: string
- name: context_token_count
dtype: int64
- name: gpt-4_response
dtype: string
- name: gpt-4_cost
dtype: float64
- name: gpt-4_sys5_response
dtype: string
- name: gpt-4_sys5_cost
dtype: float64
- name: sys5_obj
struct:
- name: focus_areas
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: industries
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: products_and_technologies
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: mistral01_gen
dtype: string
- name: eval_crit
struct:
- name: focus_areas
dtype: float64
- name: industries
dtype: float64
- name: products_and_technologies
dtype: float64
- name: eval_values
struct:
- name: focus_areas
sequence: int64
- name: industries
sequence: int64
- name: products_and_technologies
sequence: int64
splits:
- name: train
num_bytes: 29522941
num_examples: 3352
download_size: 13732560
dataset_size: 29522941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingface-projects/color-palettes-sd | ---
license: cc-by-4.0
---
|
EleutherAI/quirky_sciq_alice_hard | ---
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
splits:
- name: train
num_bytes: 747497.4535258075
num_examples: 1205
- name: validation
num_bytes: 132886.768
num_examples: 224
- name: test
num_bytes: 157273.79
num_examples: 265
download_size: 314869
dataset_size: 1037658.0115258076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
enoahjr/twitter_dataset_1713175765 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 297150
num_examples: 840
download_size: 142018
dataset_size: 297150
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
one-sec-cv12/chunk_25 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 15838891488.75
num_examples: 164906
download_size: 14036326858
dataset_size: 15838891488.75
---
# Dataset Card for "chunk_25"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ruanchaves/assin_por_Latn_to_glg_Latn | ---
dataset_info:
features:
- name: sentence_pair_id
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: relatedness_score
dtype: float32
- name: entailment_judgment
dtype:
class_label:
names:
'0': NONE
'1': ENTAILMENT
'2': PARAPHRASE
- name: __language__
dtype: string
splits:
- name: train
num_bytes: 1005495
num_examples: 5000
- name: test
num_bytes: 781854
num_examples: 4000
- name: validation
num_bytes: 201144
num_examples: 1000
download_size: 0
dataset_size: 1988493
---
# Dataset Card for "assin_por_Latn_to_glg_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RyanZZZZZ/w5_dev_all_input_bhc_sampled | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 4489574.964331816
num_examples: 300
download_size: 2332107
dataset_size: 4489574.964331816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
reshinthadith/basic_code_ppl_eval | ---
license: apache-2.0
task_categories:
- text-generation
tags:
- code
size_categories:
- 1K<n<10K
--- |
liuyanchen1015/MULTI_VALUE_sst2_regularized_reflexives | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 816
num_examples: 5
- name: test
num_bytes: 2037
num_examples: 11
- name: train
num_bytes: 34953
num_examples: 272
download_size: 19711
dataset_size: 37806
---
# Dataset Card for "MULTI_VALUE_sst2_regularized_reflexives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
David-Xu/atels_embedding_all | ---
dataset_info:
features:
- name: atelNum
dtype: int64
- name: title
dtype: string
- name: authors
dtype: string
- name: body
dtype: string
- name: submissionDate
dtype: string
- name: keywords
dtype: string
- name: submissionDay
dtype: string
- name: submissionTime
dtype: string
- name: length_of_text
dtype: int64
- name: embedding
sequence: float64
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 74345499
num_examples: 15238
download_size: 53935272
dataset_size: 74345499
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davanstrien/raw-tldr-dataset-preference | ---
dataset_info:
features:
- name: datasetId
dtype: string
- name: author
dtype: string
- name: last_modified
dtype: timestamp[us, tz=UTC]
- name: downloads
dtype: int64
- name: likes
dtype: int64
- name: tags
sequence: string
- name: task_categories
sequence: string
- name: createdAt
dtype: timestamp[us, tz=UTC]
- name: card
dtype: string
- name: parsed_card
dtype: string
- name: length
dtype: int64
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 41241887
num_examples: 1000
download_size: 15466212
dataset_size: 41241887
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HuggingFaceM4/NoCaps | ---
license: cc-by-2.0
---
# Dataset Card for NoCaps
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://nocaps.org/](https://nocaps.org/)
- **Paper:** [nocaps: novel object captioning at scale](https://openaccess.thecvf.com/content_ICCV_2019/papers/Agrawal_nocaps_novel_object_captioning_at_scale_ICCV_2019_paper.pdf)
- **Leaderboard:**
- **Point of Contact:** contact@nocaps.org
### Dataset Summary
Dubbed NoCaps, for novel object captioning at scale, the benchmark consists of 166,100 human-generated captions describing 15,100 images from the Open Images validation and test sets.
The associated training data consists of COCO image-caption pairs, plus Open Images image-level labels and object bounding boxes.
Since Open Images contains many more classes than COCO, nearly 400 object classes seen in test images have no or very few associated training captions (hence, nocaps).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
Each instance has the following structure:
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=L size=732x1024 at 0x7F574A3A9B50>,
'image_coco_url': 'https://s3.amazonaws.com/nocaps/val/0013ea2087020901.jpg',
'image_date_captured': '2018-11-06 11:04:33',
'image_file_name': '0013ea2087020901.jpg',
'image_height': 1024,
'image_width': 732,
'image_id': 0,
'image_license': 0,
'image_open_images_id': '0013ea2087020901',
'annotations_ids': [0, 1, 2, 3, 4, 5, 6, 7, 8, 9],
'annotations_captions': [
'A baby is standing in front of a house.',
'A little girl in a white jacket and sandals.',
'A young child stands in front of a house.',
'A child is wearing a white shirt and standing on a side walk. ',
'A little boy is standing in his diaper with a white shirt on.',
'A child wearing a diaper and shoes stands on the sidewalk.',
'A child is wearing a light-colored shirt during the daytime.',
'A little kid standing on the pavement in a shirt. ',
'Black and white photo of a little girl smiling.',
'a cute baby is standing alone with white shirt'
]
}
```
### Data Fields
- `image`: The image
- `image_coco_url`: URL for the image
- `image_date_captured`: Date at which the image was captured
- `image_file_name`: The file name for the image
- `image_height`: Height of the image
- `image_width`: Width of the image
- `image_id`: Id of the image
- `image_license`: License identifier for the image; its meaning is unclear, and it is always 0 in this dataset
- `image_open_images_id`: Open image id
- `annotations_ids`: Unique ids for the captions (to use in conjunction with `annotations_captions`)
- `annotations_captions`: Captions for the image (to use in conjunction with `annotations_ids`)
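A minimal loading sketch using the fields above; the split name is an assumption, since the splits section of this card is empty:
```python
from datasets import load_dataset

# "validation" is an assumed split name; adjust to the splits actually published.
nocaps = load_dataset("HuggingFaceM4/NoCaps", split="validation")
sample = nocaps[0]
print(sample["image_open_images_id"])
print(sample["annotations_captions"][0])
```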
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@VictorSanh](https://github.com/VictorSanh) for adding this dataset. |
zwli/GroundingGPT | ---
license: apache-2.0
---
GroundingGPT |
Ziggy1/dataset | ---
license: apache-2.0
---
|
ars-1/autotrain-data-javascript-traing-1 | ---
task_categories:
- summarization
---
# AutoTrain Dataset for project: javascript-traing-1
## Dataset Description
This dataset has been automatically processed by AutoTrain for project javascript-traing-1.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"target": "test/NavbarSpec.js",
"feat_repo_name": "aabenoja/react-bootstrap",
"text": "import React from 'react';\nimport ReactTestUtils from 'react/lib/ReactTestUtils';\nimport Navbar from '../src/Navbar';\nimport Nav from '../src/Nav';\n\ndescribe('Nav', function () {\n\n it('Should create nav element', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar />\n );\n let nav = instance.getDOMNode();\n assert.equal(nav.nodeName, 'NAV');\n assert.ok(nav.className.match(/\\bnavbar\\b/));\n assert.ok(nav.getAttribute('role'), 'navigation');\n });\n\n it('Should add fixedTop variation class', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar fixedTop />\n );\n assert.ok(ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-fixed-top'));\n });\n\n it('Should add fixedBottom variation class', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar fixedBottom />\n );\n assert.ok(ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-fixed-bottom'));\n });\n\n it('Should add staticTop variation class', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar staticTop />\n );\n assert.ok(ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-static-top'));\n });\n\n it('Should add inverse variation class', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar inverse />\n );\n assert.ok(ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-inverse'));\n });\n\n it('Should add fluid variation class', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar fluid />\n );\n assert.ok(ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'container-fluid'));\n });\n\n it('Should override role attribute', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar role=\"banner\"/>\n );\n assert.ok(instance.getDOMNode().getAttribute('role'), 'banner');\n });\n\n it('Should override node class', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar componentClass={'header'}/>\n );\n assert.ok(instance.getDOMNode().nodeName, 'HEADER');\n });\n\n it('Should add header with brand', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar brand=\"Brand\" />\n );\n\n let header = ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-header');\n\n assert.ok(header);\n\n let brand = ReactTestUtils.findRenderedDOMComponentWithClass(header, 'navbar-brand');\n\n assert.ok(brand);\n assert.equal(brand.getDOMNode().innerText, 'Brand');\n });\n\n it('Should add header with brand component', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar brand={<a>Brand</a>} />\n );\n\n let header = ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-header');\n\n assert.ok(header);\n\n let brand = ReactTestUtils.findRenderedDOMComponentWithClass(header, 'navbar-brand');\n\n assert.ok(brand);\n assert.equal(brand.getDOMNode().nodeName, 'A');\n assert.equal(brand.getDOMNode().innerText, 'Brand');\n });\n\n it('Should pass navbar prop to navs', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar brand=\"Brand\">\n <Nav />\n </Navbar>\n );\n\n let nav = ReactTestUtils.findRenderedComponentWithType(instance, Nav);\n\n assert.ok(nav.props.navbar);\n });\n\n it('Should pass nav prop to ul', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Nav />\n );\n\n let navNode = ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'nav').getDOMNode();\n assert.ok(navNode);\n 
assert.equal(navNode.nodeName, 'UL');\n assert.equal(navNode.parentNode.nodeName, 'NAV');\n\n instance.setProps({navbar: true});\n\n navNode = ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'nav').getDOMNode();\n assert.ok(navNode);\n assert.equal(navNode.nodeName, 'UL');\n assert.equal(navNode.parentNode.nodeName, 'DIV');\n });\n\n it('Should add header when toggleNavKey is 0', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar toggleNavKey={0}>\n <Nav eventKey={0} />\n </Navbar>\n );\n\n let header = ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-header');\n\n assert.ok(header);\n });\n\n it('Should add header when toggleNavKey is 1', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar toggleNavKey={1}>\n <Nav eventKey={1} />\n </Navbar>\n );\n\n let header = ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-header');\n\n assert.ok(header);\n });\n\n it('Should add header when toggleNavKey is string', function () {\n let instance = ReactTestUtils.renderIntoDocument(\n <Navbar toggleNavKey={'string'}>\n <Nav eventKey={'string'} />\n </Navbar>\n );\n\n let header = ReactTestUtils.findRenderedDOMComponentWithClass(instance, 'navbar-header');\n\n assert.ok(header);\n });\n});\n"
},
{
"target": "node_modules/rc-slider/lib/common/Steps.js",
"feat_repo_name": "maty21/statistisc",
"text": "'use strict';\n\nObject.defineProperty(exports, \"__esModule\", {\n value: true\n});\n\nvar _defineProperty2 = require('babel-runtime/helpers/defineProperty');\n\nvar _defineProperty3 = _interopRequireDefault(_defineProperty2);\n\nvar _react = require('react');\n\nvar _react2 = _interopRequireDefault(_react);\n\nvar _classnames = require('classnames');\n\nvar _classnames2 = _interopRequireDefault(_classnames);\n\nvar _warning = require('warning');\n\nvar _warning2 = _interopRequireDefault(_warning);\n\nfunction _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { \"default\": obj }; }\n\nvar calcPoints = function calcPoints(vertical, marks, dots, step, min, max) {\n (0, _warning2[\"default\"])(dots ? step > 0 : true, '`Slider[step]` should be a positive number in order to make Slider[dots] work.');\n var points = Object.keys(marks).map(parseFloat);\n if (dots) {\n for (var i = min; i <= max; i = i + step) {\n if (points.indexOf(i) >= 0) continue;\n points.push(i);\n }\n }\n return points;\n};\n\nvar Steps = function Steps(_ref) {\n var prefixCls = _ref.prefixCls,\n vertical = _ref.vertical,\n marks = _ref.marks,\n dots = _ref.dots,\n step = _ref.step,\n included = _ref.included,\n lowerBound = _ref.lowerBound,\n upperBound = _ref.upperBound,\n max = _ref.max,\n min = _ref.min;\n\n var range = max - min;\n var elements = calcPoints(vertical, marks, dots, step, min, max).map(function (point) {\n var _classNames;\n\n var offset = Math.abs(point - min) / range * 100 + '%';\n var style = vertical ? { bottom: offset } : { left: offset };\n\n var isActived = !included && point === upperBound || included && point <= upperBound && point >= lowerBound;\n var pointClassName = (0, _classnames2[\"default\"])((_classNames = {}, (0, _defineProperty3[\"default\"])(_classNames, prefixCls + '-dot', true), (0, _defineProperty3[\"default\"])(_classNames, prefixCls + '-dot-active', isActived), _classNames));\n\n return _react2[\"default\"].createElement('span', { className: pointClassName, style: style, key: point });\n });\n\n return _react2[\"default\"].createElement(\n 'div',\n { className: prefixCls + '-step' },\n elements\n );\n};\n\nexports[\"default\"] = Steps;\nmodule.exports = exports['default'];"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "Value(dtype='string', id=None)",
"feat_repo_name": "Value(dtype='string', id=None)",
"text": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 80000 |
| valid | 20000 |
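As a quick check that the card matches the data, the fields and split sizes above can be verified after loading. A minimal sketch, using a placeholder repo id (`user/code-dataset`) since this excerpt doesn't show the card's actual identifier:
```python
from datasets import load_dataset

# NOTE: "user/code-dataset" is a placeholder repo id, not the real one.
ds = load_dataset("user/code-dataset")

# The three string fields listed under "Dataset Fields".
print(ds["train"].features)

# The split sizes from the table above: train -> 80000, valid -> 20000.
for name, split in ds.items():
    print(name, split.num_rows)
```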
|
jorgeortizfuentes/chilean-spanish-corpus | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- es
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
pretty_name: Chilean Spanish Corpus
dataset_info:
features:
- name: text
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 31427795307.483433
num_examples: 37126025
download_size: 18718981152
dataset_size: 31427795307.483433
---
# Chilean Spanish Corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Chilean Spanish
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
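Although split documentation is pending, the YAML metadata at the top of this card declares a single `train` split with 37,126,025 examples and an ~18.7 GB download, so streaming may be preferable to a full download. A minimal sketch based only on that metadata:
```python
from datasets import load_dataset

# Stream to avoid downloading the full ~18.7 GB corpus up front.
ds = load_dataset("jorgeortizfuentes/chilean-spanish-corpus", split="train", streaming=True)

# Each record carries the two declared fields: `text` and `source`.
for example in ds.take(3):
    print(example["source"], example["text"][:80])
```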
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@jorgeortizfuentes](https://github.com/jorgeortizfuentes) for adding this dataset. |
KnutJaegersberg/webglm_dataset | ---
license: cc-by-nc-4.0
---
|
open-llm-leaderboard/details_0-hero__Matter-0.1-Slim-7B | ---
pretty_name: Evaluation run of 0-hero/Matter-0.1-Slim-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0-hero/Matter-0.1-Slim-7B](https://huggingface.co/0-hero/Matter-0.1-Slim-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0-hero__Matter-0.1-Slim-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T22:23:44.167497](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-Slim-7B/blob/main/results_2024-03-13T22-23-44.167497.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.605490033884534,\n\
\ \"acc_stderr\": 0.03326571312658935,\n \"acc_norm\": 0.6102815752899539,\n\
\ \"acc_norm_stderr\": 0.0339481142869132,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4178982620141241,\n\
\ \"mc2_stderr\": 0.014144240439100468\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6183031268671579,\n\
\ \"acc_stderr\": 0.004848099661619697,\n \"acc_norm\": 0.8132842063333997,\n\
\ \"acc_norm_stderr\": 0.003888868099629071\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.667741935483871,\n \"acc_stderr\": 0.026795560848122794,\n \"\
acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.026795560848122794\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.781651376146789,\n\
\ \"acc_stderr\": 0.01771260052872272,\n \"acc_norm\": 0.781651376146789,\n\
\ \"acc_norm_stderr\": 0.01771260052872272\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n\
\ \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569506,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569506\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546655,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n\
\ \"acc_stderr\": 0.015334566806251154,\n \"acc_norm\": 0.3005586592178771,\n\
\ \"acc_norm_stderr\": 0.015334566806251154\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.012638223880313168,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.012638223880313168\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296014,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4178982620141241,\n\
\ \"mc2_stderr\": 0.014144240439100468\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698332\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3813495072024261,\n \
\ \"acc_stderr\": 0.013379089877400751\n }\n}\n```"
repo_url: https://huggingface.co/0-hero/Matter-0.1-Slim-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|arc:challenge|25_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|gsm8k|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hellaswag|10_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T22-23-44.167497.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T22-23-44.167497.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- '**/details_harness|winogrande|5_2024-03-13T22-23-44.167497.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T22-23-44.167497.parquet'
- config_name: results
data_files:
- split: 2024_03_13T22_23_44.167497
path:
- results_2024-03-13T22-23-44.167497.parquet
- split: latest
path:
- results_2024-03-13T22-23-44.167497.parquet
---
# Dataset Card for Evaluation run of 0-hero/Matter-0.1-Slim-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0-hero/Matter-0.1-Slim-7B](https://huggingface.co/0-hero/Matter-0.1-Slim-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-Slim-7B",
"harness_winogrande_5",
split="train")
```
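The aggregated metrics mentioned above live in the "results" configuration; per the config list in this card, its "latest" split always points to the newest run. A minimal sketch:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split tracks the newest run.
results = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-Slim-7B",
                       "results",
                       split="latest")
print(results[0])
```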
## Latest results
These are the [latest results from run 2024-03-13T22:23:44.167497](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-Slim-7B/blob/main/results_2024-03-13T22-23-44.167497.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.605490033884534,
"acc_stderr": 0.03326571312658935,
"acc_norm": 0.6102815752899539,
"acc_norm_stderr": 0.0339481142869132,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4178982620141241,
"mc2_stderr": 0.014144240439100468
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938169
},
"harness|hellaswag|10": {
"acc": 0.6183031268671579,
"acc_stderr": 0.004848099661619697,
"acc_norm": 0.8132842063333997,
"acc_norm_stderr": 0.003888868099629071
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122794,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122794
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.01771260052872272,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.01771260052872272
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569506,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569506
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.04284467968052194,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.04284467968052194
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546655,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3005586592178771,
"acc_stderr": 0.015334566806251154,
"acc_norm": 0.3005586592178771,
"acc_norm_stderr": 0.015334566806251154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313168,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313168
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296014,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4178982620141241,
"mc2_stderr": 0.014144240439100468
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698332
},
"harness|gsm8k|5": {
"acc": 0.3813495072024261,
"acc_stderr": 0.013379089877400751
}
}
```
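For convenience, a minimal loading sketch using `datasets`. The config name below follows the usual leaderboard naming scheme and is an assumption, not verified against this repo; list the repo's configs to confirm.
```python
from datasets import load_dataset

# Hedged sketch: pull the details of one eval from this results repo.
# The config name ("harness_winogrande_5") is an assumption based on the
# usual open-llm-leaderboard layout; the "latest" split is described above.
details = load_dataset(
    "open-llm-leaderboard/details_0-hero__Matter-0.1-Slim-7B",
    "harness_winogrande_5",
    split="latest",
)
print(details[0])
```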
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
iamplus/Conversation_Repo | ---
license: apache-2.0
---
Datasets:
1. ShareGPT **(https://huggingface.co/datasets/RyokoAI/ShareGPT52K)** - ***https://huggingface.co/datasets/manojpreveen/ConversationalRepo/tree/main/sharegpt-raw***
2. OpenAssistant **(https://huggingface.co/datasets/OpenAssistant/oasst1 -> https://huggingface.co/datasets/h2oai/openassistant_oasst1)** - ***https://huggingface.co/datasets/manojpreveen/ConversationalRepo/tree/main/OpenAssistant***
3. ultrachat **(https://huggingface.co/datasets/stingning/ultrachat)** - ***https://huggingface.co/datasets/manojpreveen/ConversationalRepo/tree/main/ultrachat***
4. baize **(https://github.com/project-baize/baize-chatbot)** - ***https://huggingface.co/datasets/manojpreveen/ConversationalRepo/tree/main/baize***
5. camel **(https://huggingface.co/datasets/camel-ai/ai_society, https://huggingface.co/datasets/camel-ai/code)** - ***https://huggingface.co/datasets/manojpreveen/ConversationalRepo/tree/main/camel***
6. roleplay **(Extended version of https://huggingface.co/datasets/fka/awesome-chatgpt-prompts)** - ***https://huggingface.co/datasets/manojpreveen/ConversationalRepo/tree/main/roleplay***
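As a hedged sketch of pulling one of the sources above with `datasets` (the split name is an assumption; check each linked repo for its actual layout):
```python
from datasets import load_dataset

# Hedged sketch: load the OpenAssistant source listed above.
# The "train" split name is an assumption; consult the linked repo.
oasst = load_dataset("OpenAssistant/oasst1", split="train")
print(oasst[0])
```
Each linked repo above documents its own preprocessing of the raw source. |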
pszemraj/booksum-1024-output | ---
source_datasets: kmfoda/booksum
license:
- bsd-3-clause
train-eval-index:
- config: pszemraj--booksum_1024
task: summarization
task_id: summarization
splits:
eval_split: test
col_mapping:
chapter: text
summary_text: target
task_categories:
- summarization
- text2text-generation
size_categories:
- 1K<n<10K
---
# booksum - 1024 tokens max output
**goal:** explicitly limit the maximum output length to prevent partial summaries from being generated.
- [notebook](https://colab.research.google.com/gist/pszemraj/7f7b66d535441a9d0e8419fde1e1c98a/booksum-1024.ipynb) used to create this dataset; see the sketch below.
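One plausible reading of that goal, sketched under assumptions (the tokenizer choice is a guess, and the original notebook may truncate rather than filter; the notebook above is authoritative). The `summary_text` field name comes from the `col_mapping` in this card's header.
```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Hedged sketch: keep only source examples whose target summary fits in
# 1024 tokens. The "t5-small" tokenizer is an assumption, not the
# notebook's actual choice.
tok = AutoTokenizer.from_pretrained("t5-small")
booksum = load_dataset("kmfoda/booksum", split="train")

def fits_in_1024(example):
    return len(tok(example["summary_text"]).input_ids) <= 1024

booksum_1024 = booksum.filter(fits_in_1024)
print(len(booksum), "->", len(booksum_1024))
```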
## info
 |
Jing24/low_all_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 79656651
num_examples: 87599
download_size: 14271933
dataset_size: 79656651
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "low_all_train"
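Given the SQuAD-style schema in the header above, a minimal loading sketch (field names are taken directly from the `dataset_info` block):
```python
from datasets import load_dataset

# Load the train split and inspect one SQuAD-style example.
ds = load_dataset("Jing24/low_all_train", split="train")
ex = ds[0]
print(ex["question"])
print(ex["answers"]["text"], ex["answers"]["answer_start"])
```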
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spongus/milly-images | ---
license: unlicense
tags:
- image
- cat
- silly
- calico
pretty_name: Milly Images
task_categories:
- text-to-image
- image-classification
- image-segmentation
language:
- en
size_categories:
- n<1K
---
A collection of images of a very silly cat, all from @fatfatmillycat on Twitter. Intended to be used with stable-diffusion-v1-4; a loading sketch follows below.
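A minimal browsing sketch (the `image` column name is an assumption about this repo's layout, not confirmed by the card):
```python
from datasets import load_dataset

# Hedged sketch: load the images and preview one.
# The "image" column name is an assumption about this repo's layout.
milly = load_dataset("spongus/milly-images", split="train")
milly[0]["image"].show()  # PIL image; opens in the default viewer
```
From here the images can feed any standard text-to-image fine-tuning pipeline. |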