| datasetId | card |
|---|---|
bigbio/pubtator_central |
---
language:
- en
bigbio_language:
- English
license: other
multilinguality: monolingual
bigbio_license_shortname: NCBI_LICENSE
pretty_name: PubTator Central
homepage: https://www.ncbi.nlm.nih.gov/research/pubtator/
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- NAMED_ENTITY_DISAMBIGUATION
---
# Dataset Card for PubTator Central
## Dataset Description
- **Homepage:** https://www.ncbi.nlm.nih.gov/research/pubtator/
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER, NED
PubTator Central (PTC, https://www.ncbi.nlm.nih.gov/research/pubtator/) is a web service for
exploring and retrieving bioconcept annotations in full text biomedical articles. PTC provides
automated annotations from state-of-the-art text mining systems for genes/proteins, genetic
variants, diseases, chemicals, species and cell lines, all available for immediate download. PTC
annotates PubMed (30 million abstracts), the PMC Open Access Subset and the Author Manuscript
Collection (3 million full text articles). Updated entity identification methods and a
disambiguation module based on cutting-edge deep learning techniques provide increased accuracy.
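Documents in the raw PubTator export use a simple plain-text layout: a `PMID|t|title` line and a `PMID|a|abstract` line, followed by tab-separated annotation lines (PMID, start offset, end offset, mention, entity type, concept ID). A minimal parsing sketch for one such block (the sample record is illustrative):

```python
def parse_pubtator(block: str) -> dict:
    """Parse one PubTator-format document block (title/abstract lines
    plus tab-separated annotation lines) into a plain dict."""
    doc = {"pmid": None, "title": "", "abstract": "", "annotations": []}
    for line in block.strip().splitlines():
        if "|t|" in line:
            # Title line: "PMID|t|title text"
            doc["pmid"], doc["title"] = line.split("|t|", 1)
        elif "|a|" in line:
            # Abstract line: "PMID|a|abstract text"
            doc["abstract"] = line.split("|a|", 1)[1]
        else:
            # Annotation line: PMID, start, end, mention, type, concept ID
            _, start, end, mention, etype, concept_id = line.split("\t")
            doc["annotations"].append({
                "start": int(start),
                "end": int(end),
                "mention": mention,
                "type": etype,
                "id": concept_id,
            })
    return doc

sample = (
    "25763772|t|DCTN4 as a modifier of chronic Pseudomonas aeruginosa infection\n"
    "25763772|a|Pseudomonas aeruginosa infection in cystic fibrosis ...\n"
    "25763772\t0\t5\tDCTN4\tGene\t51164"
)
doc = parse_pubtator(sample)
print(doc["pmid"], len(doc["annotations"]))
```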
## Citation Information
```
@article{10.1093/nar/gkz389,
title = {{PubTator central: automated concept annotation for biomedical full text articles}},
author = {Wei, Chih-Hsuan and Allot, Alexis and Leaman, Robert and Lu, Zhiyong},
year = 2019,
month = {05},
journal = {Nucleic Acids Research},
volume = 47,
number = {W1},
pages = {W587-W593},
doi = {10.1093/nar/gkz389},
issn = {0305-1048},
url = {https://doi.org/10.1093/nar/gkz389},
eprint = {https://academic.oup.com/nar/article-pdf/47/W1/W587/28880193/gkz389.pdf}
}
```
|
reciprocate/alpaca-eval | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: selected
dtype: string
- name: rejected
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 14630155.18757764
num_examples: 9418
- name: test
num_bytes: 1626435.8124223603
num_examples: 1047
download_size: 7916104
dataset_size: 16256591.0
---
# Dataset Card for "alpaca-eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DataStudio/OCR_Red | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 94279480.25
num_examples: 3550
download_size: 94231656
dataset_size: 94279480.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pvduy/dpo_data_capy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 264301004
num_examples: 45600
- name: test
num_bytes: 8556760
num_examples: 1964
download_size: 148360235
dataset_size: 272857764
---
# Dataset Card for "dpo_data_capy"
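The `chosen` and `rejected` columns store full chat transcripts as lists of `{content, role}` messages. A small sketch of pulling the final assistant turn out of such a list (the example row is made up, not taken from the dataset):

```python
def last_assistant_message(messages):
    """Return the content of the last assistant turn in a chat transcript,
    or None if the transcript contains no assistant turn."""
    for msg in reversed(messages):
        if msg["role"] == "assistant":
            return msg["content"]
    return None

# Made-up example mirroring the structure of the `chosen` column.
chosen = [
    {"role": "user", "content": "What is a capybara?"},
    {"role": "assistant", "content": "The largest living rodent."},
]
print(last_assistant_message(chosen))
```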
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ouvic215/Test-Dataset-0222 | ---
dataset_info:
features:
- name: mask_image
dtype: image
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 147332332.0
num_examples: 1588
download_size: 146499523
dataset_size: 147332332.0
---
# Dataset Card for "Test-Dataset-0222"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
biglab/webui-test | ---
license: other
---
This data accompanies the WebUI project (https://dl.acm.org/doi/abs/10.1145/3544548.3581158)
For more information, check out the project website: https://uimodeling.github.io/
To download this dataset, you need to install the huggingface-hub package
```
pip install huggingface-hub
```
Then use `snapshot_download`:
```
from huggingface_hub import snapshot_download
snapshot_download(repo_id="biglab/webui-test", repo_type="dataset")
```
IMPORTANT
* Before downloading and using, please review the copyright info here: https://github.com/js0nwu/webui/blob/main/COPYRIGHT.txt
* Not all data samples have the same number of files (e.g., the same number of device screenshots) because the crawler used a timeout during collection
* The dataset released on HuggingFace was filtered using a list of explicit words and therefore contains fewer samples than the experiments originally used in the paper. The raw dataset is currently available (https://drive.google.com/drive/folders/1hcO75W2FjsZoibsj2TIbKz67hy9JkOBz?usp=share_link) but may be removed in the future. |
kanishka/counterfactual-babylm-only_random_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581653979
num_examples: 11605527
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421391359
dataset_size: 637774209
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
2ndBestKiller/DrugTest | ---
license: unknown
task_categories:
- token-classification
language:
- de
tags:
- medical
size_categories:
- 1K<n<10K
---
There is also a version with classLabels here:
2ndBestKiller/DrugTestWithClassLabels
The set consists of a mix of German Wikipedia articles, medical guidelines, medication analyses, patient information leaflets, and AI-generated text.
The dataset was auto-annotated with a phrase matcher (exact matching) using a list of all medical substances available in Germany,
as listed by the BfArM (the list does not include brand names). The annotations have not been checked manually! |
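Exact phrase matching of this kind can be sketched in a few lines. The snippet below is a simplified stand-in for the phrase-matcher pipeline described above (not the original annotation code), using a made-up sentence and substance list:

```python
import re

def auto_annotate(text, substances):
    """Exact, case-insensitive phrase matching over a substance list,
    returning character-offset spans sorted by start position."""
    spans = []
    for name in substances:
        # \b word boundaries keep the match from firing inside longer words.
        pattern = r"\b" + re.escape(name) + r"\b"
        for m in re.finditer(pattern, text, flags=re.IGNORECASE):
            spans.append({"start": m.start(), "end": m.end(),
                          "text": m.group(), "label": "SUBSTANCE"})
    return sorted(spans, key=lambda s: s["start"])

text = "Der Patient erhielt Ibuprofen und spaeter Metformin."
print(auto_annotate(text, ["Ibuprofen", "Metformin"]))
```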
Rahmaa/SciTLDR_ClEaN | ---
license: openrail
---
|
hle2000/Mintaka_Graph_Features_Updated_T5-xl-ssm | ---
dataset_info:
features:
- name: question
dtype: string
- name: question_answer
dtype: string
- name: num_nodes
dtype: int64
- name: num_edges
dtype: int64
- name: density
dtype: float64
- name: cycle
dtype: int64
- name: bridge
dtype: int64
- name: katz_centrality
dtype: float64
- name: page_rank
dtype: float64
- name: avg_ssp_length
dtype: float64
- name: determ_sequence
dtype: string
- name: gap_sequence
dtype: string
- name: g2t_sequence
dtype: string
- name: determ_sequence_embedding
dtype: string
- name: gap_sequence_embedding
dtype: string
- name: g2t_sequence_embedding
dtype: string
- name: question_answer_embedding
dtype: string
- name: tfidf_vector
dtype: string
- name: correct
dtype: float64
splits:
- name: train
num_bytes: 9765642932
num_examples: 86381
- name: test
num_bytes: 2442628533
num_examples: 21574
download_size: 2702676747
dataset_size: 12208271465
---
# Dataset Card for "Mintaka_Graph_Features_Updated_T5-xl-ssm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhuynh95/Magicoder-Evol-Instruct-500-CodeLlama-70b-tokenized-0.5-Special-Token | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1122996
num_examples: 500
download_size: 575407
dataset_size: 1122996
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kgr123/quality_counter_5120_4_uniq | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 558587914
num_examples: 20000
- name: validation
num_bytes: 221539952
num_examples: 8000
- name: test
num_bytes: 56238158
num_examples: 2300
download_size: 26660389
dataset_size: 836366024
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
HelloImSteven/applescript-lines-annotated | ---
dataset_info:
features:
- name: text
dtype: string
- name: source
dtype: string
- name: type
dtype: string
- name: intents
sequence: string
- name: tags
sequence: string
- name: description
dtype: string
- name: customTerms
sequence: string
- name: main_prompt
dtype: string
- name: other_prompts
sequence: string
splits:
- name: train
num_bytes: 345695.0
num_examples: 510
download_size: 123493
dataset_size: 345695.0
license: mit
task_categories:
- summarization
- text-generation
- text2text-generation
language:
- en
tags:
- applescript
- code
pretty_name: ASLines
size_categories:
- n<1K
---
# Dataset Card for "applescript-lines-annotated"
## Description
This is a dataset of single lines of AppleScript code scraped from GitHub and GitHub Gist and manually annotated with descriptions, intents, prompts, and other metadata.
## Content
Each row contains 8 features:
- `text` - The raw text of the AppleScript code.
- `source` - The name of the file from which the line originates.
- `type` - Either `compiled` (files using the `.scpt` extension) or `uncompiled` (everything else).
- `intents` - A list of intents the line invokes. See [Intents](#intents) for more info.
- `tags` - A list of tags associated with the line. See [Tags](#tags) for more info.
- `description` - One or more sentences describing what the line does, what its purpose is, and other relevant context.
- `customTerms` - A list of the custom terms used in the line, such as variable or handler names.
- `main_prompt` - A relevant prompt specific to the line.
- `other_prompts` - A list of prompts relevant to the line (but not necessarily specific to it).
### Intents
Intents describe the actions carried out by a line of code, i.e. what the line *does*. All intents used are listed below.
| Intent | Example Line |
| ----- | ----- |
| set property | `property myProperty: 5` |
| set variable | `set myVariable to 5` |
| begin handler definition | `on makePDF(title, content)` |
| end handler definition | `end makePDF` |
| call handler | `my makePDF("Example Title", "Example content")` |
| perform action on script execution | `on run` |
| access value of property | `log myProperty` |
| access value of variable | `log myVariable` |
| get substring | `text 2 thru end of "Hello"` |
| concatenate strings | `"Hello" & " world"` |
| check condition | `if x > 4 then` |
| end condition | `end if` |
| begin instructions | `tell application "System Events"` |
| end instructions | `end tell` |
| interact with user interface | `click at {100, 200}` |
| pause | `delay 2` |
| begin error handling | `try` |
| end error handling | `end try` |
| perform action | `open location "https://google.com"` |
| begin repetition | `repeat with i from 1 thru 5` |
| end repetition | `end repeat` |
| filter list | `set t to tracks whose unplayed is true` |
| return | `return 5` |
| import library | `use framework "Foundation"` |
| display UI element | `display dialog "Test"` |
| open file | `set f to open for access filePath` |
| close file | `close access f` |
| begin script definition | `script myScript` |
| end script definition | `end script` |
| declare variable | `local x, y` |
| handle error | `on error err` |
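Since `intents` is a sequence feature, rows can be filtered by intent with a simple membership test. A minimal sketch using made-up rows that mirror the schema above (loading the real data would go through `datasets.load_dataset`):

```python
# Two illustrative rows mirroring the dataset schema (not actual entries).
rows = [
    {"text": "set myVariable to 5", "intents": ["set variable"],
     "tags": ["complete statement", "contains variable"]},
    {"text": "end repeat", "intents": ["end repetition"],
     "tags": ["end of block"]},
]

def rows_with_intent(rows, intent):
    """Return every row whose `intents` list contains the given intent."""
    return [r for r in rows if intent in r["intents"]]

print(len(rows_with_intent(rows, "set variable")))
```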
### Tags
Tags describe what a line *is* or what it *contains*. All tags used are listed below.
- contains handler
- contains list
- contains property
- contains variable
- start of block
- complete statement
- contains raw text
- contains location specifier
- contains condition
- contains number
- end of block
- contains boolean
- gui scripting
- contains comment
- contains cast
- AsOBjC
- shebang
- contains script object
- contains record
## Usage
This dataset was created for the AppleScript-Summarizer model as a personal project, but it can be used by others for any purpose. |
Asap7772/Math-Shepherd | ---
dataset_info:
features:
- name: question
dtype: string
- name: steps
sequence: string
- name: steps_noprefix
sequence: string
- name: steps_label
sequence: string
- name: dense_reward
sequence: int64
- name: sparse_reward
sequence: int64
- name: input
dtype: string
- name: label
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 1296524890.0222304
num_examples: 399748
- name: test
num_bytes: 144060122.97776952
num_examples: 44417
download_size: 677837070
dataset_size: 1440585013.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2 | ---
pretty_name: Evaluation run of TheTravellingEngineer/bloom-560m-RLHF-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheTravellingEngineer/bloom-560m-RLHF-v2](https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T18:07:38.079229](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2/blob/main/results_2023-10-21T18-07-38.079229.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268527,\n \"f1\": 0.03876782718120811,\n\
\ \"f1_stderr\": 0.00113779684793395,\n \"acc\": 0.2549173544570191,\n\
\ \"acc_stderr\": 0.007404160104110119\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268527,\n\
\ \"f1\": 0.03876782718120811,\n \"f1_stderr\": 0.00113779684793395\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225266\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5090765588003157,\n \"acc_stderr\": 0.01405017009449771\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T18_07_38.079229
path:
- '**/details_harness|drop|3_2023-10-21T18-07-38.079229.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T18-07-38.079229.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T18_07_38.079229
path:
- '**/details_harness|gsm8k|5_2023-10-21T18-07-38.079229.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T18-07-38.079229.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:22:38.044198.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:22:38.044198.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T18_07_38.079229
path:
- '**/details_harness|winogrande|5_2023-10-21T18-07-38.079229.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T18-07-38.079229.parquet'
- config_name: results
data_files:
- split: 2023_08_09T14_22_38.044198
path:
- results_2023-08-09T14:22:38.044198.parquet
- split: 2023_10_21T18_07_38.079229
path:
- results_2023-10-21T18-07-38.079229.parquet
- split: latest
path:
- results_2023-10-21T18-07-38.079229.parquet
---
# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-560m-RLHF-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-560m-RLHF-v2](https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-21T18:07:38.079229](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2/blob/main/results_2023-10-21T18-07-38.079229.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268527,
"f1": 0.03876782718120811,
"f1_stderr": 0.00113779684793395,
"acc": 0.2549173544570191,
"acc_stderr": 0.007404160104110119
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268527,
"f1": 0.03876782718120811,
"f1_stderr": 0.00113779684793395
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225266
},
"harness|winogrande|5": {
"acc": 0.5090765588003157,
"acc_stderr": 0.01405017009449771
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
odunola/experiment-yoruba-data | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 72929372.36450547
num_examples: 200
download_size: 77381558
dataset_size: 72929372.36450547
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pythainlp/wisesight_sentiment_prompt | ---
language:
- th
license: cc0-1.0
size_categories:
- 10K<n<100K
task_categories:
- text-generation
- text2text-generation
pretty_name: i
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10132750
num_examples: 16194
- name: validation
num_bytes: 1118295
num_examples: 1777
- name: test
num_bytes: 1240521
num_examples: 1965
download_size: 3093175
dataset_size: 12491566
tags:
- instruct-fellow
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
wisesight_sentiment_prompt is an instruction dataset for Thai sentiment text, built by converting wisesight_sentiment into prompts. It can be used to fine-tune models.
- inputs: Prompt
- targets: The target text the model should answer with.
**Template**
```
Inputs: จำแนกประโยคต่อไปนี้เป็นคำถามหรือข้อความเชิงบวก/เป็นกลาง/เชิงลบ:\n{text}
targets: ประโยคที่กำหนดสามารถจำแนกข้อความได้เป็นข้อความ{category}
```
category
- คำถาม: question
- เชิงบวก: positive
- เป็นกลาง: neutral
- เชิงลบ: negative
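The template above can be filled with plain Python string formatting. The sketch below is illustrative only — the template strings are copied from this card, while `make_prompt` and the sample sentence are assumptions, not part of the dataset's tooling:

```python
# Illustrative sketch: fill the card's prompt template with str.format.
# The template strings come from this card; make_prompt and the sample
# sentence are hypothetical.
TEMPLATE_INPUTS = "จำแนกประโยคต่อไปนี้เป็นคำถามหรือข้อความเชิงบวก/เป็นกลาง/เชิงลบ:\n{text}"
TEMPLATE_TARGETS = "ประโยคที่กำหนดสามารถจำแนกข้อความได้เป็นข้อความ{category}"

def make_prompt(text, category):
    """Return an (inputs, targets) pair in the dataset's format."""
    return TEMPLATE_INPUTS.format(text=text), TEMPLATE_TARGETS.format(category=category)

inputs, targets = make_prompt("อาหารอร่อยมาก", "เชิงบวก")  # "The food is delicious" -> positive
```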
Notebook that used create this dataset: [https://github.com/PyThaiNLP/support-aya-datasets/blob/main/sentiment-analysis/wisesight_sentiment.ipynb](https://github.com/PyThaiNLP/support-aya-datasets/blob/main/sentiment-analysis/wisesight_sentiment.ipynb)
Wisesight Sentiment Corpus: Social media messages in Thai language with sentiment category (positive, neutral, negative, question)
* Released to public domain under Creative Commons Zero v1.0 Universal license.
* Size: 26,737 messages
* Language: Central Thai
* Style: Informal and conversational, with some news headlines and advertisements.
* Time period: Around 2016 to early 2019, with a small amount from other periods.
* Domains: Mixed. The majority are consumer products and services (restaurants, cosmetics, drinks, cars, hotels), with some current affairs.
See more: [wisesight_sentiment](https://huggingface.co/datasets/wisesight_sentiment).
PyThaiNLP |
rohanmahen/phrase-ticker | ---
license: mit
---
# phrase-ticker Dataset
## Description
The Phrase Ticker Dataset enables the extraction of stock ticker symbols from natural language queries. The dataset pairs natural language utterances commonly associated with S&P 500 companies with their corresponding ticker symbols, providing a simple resource for understanding how companies are referred to in various contexts.
## Structure
The dataset comprises two columns:
- `phrase`: This column contains natural language phrases that reference or describe companies in ways that are commonly used in financial news, reports, and discussions. These include not only formal company names and products but also informal and colloquial references.
- `ticker`: Each phrase is associated with a unique stock ticker symbol, identifying the company mentioned or described in the phrase.
## Primary Use Case
**Ticker Extraction from Natural Language Queries**: The main application of this dataset is to train models that can accurately identify and extract stock ticker symbols from text. This capability is crucial for automating the analysis of financial news, social media mentions, analyst reports, and any textual content where companies are discussed without directly mentioning their ticker symbols.
## Getting Started
To begin working with the phrase-ticker Dataset in your projects, you can load it using the Hugging Face `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("rohanmahen/phrase-ticker")
```
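For the ticker-extraction use case described above, a naive baseline is a substring lookup over the `phrase` column. The sketch below is illustrative only — the `phrase`/`ticker` column names come from this card, but the sample rows are hypothetical, not actual dataset entries:

```python
# Naive baseline sketch: match known phrases against a query by substring.
# The phrase/ticker column names come from the card; the rows themselves
# are hypothetical examples.
rows = [
    {"phrase": "the iphone maker", "ticker": "AAPL"},
    {"phrase": "the online retail giant", "ticker": "AMZN"},
]

lookup = {r["phrase"].lower(): r["ticker"] for r in rows}

def extract_ticker(query):
    """Return the first ticker whose phrase appears in the query, else None."""
    q = query.lower()
    for phrase, ticker in lookup.items():
        if phrase in q:
            return ticker
    return None
```

In practice you would build the lookup from the real rows loaded with `load_dataset`, or train a sequence model on the phrase/ticker pairs.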
## Contributions
Contributions to the phrase-ticker Dataset are welcomed, including the addition of new phrases, refinement of existing data, and suggestions for improvement. Please check out the repository on [GitHub](https://github.com/rohanmahen/phrase-ticker) for more information.
|
CyberHarem/koga_koharu_theidolmastercinderellagirlsu149 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koga Koharu
This is the dataset of Koga Koharu, containing 200 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 460 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 460 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 460 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 460 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Bibek1129/nepali_SQuAD | ---
license: cc-by-4.0
---
|
ruanchaves/assin_por_Latn_to_spa_Latn | ---
dataset_info:
features:
- name: sentence_pair_id
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: relatedness_score
dtype: float32
- name: entailment_judgment
dtype:
class_label:
names:
'0': NONE
'1': ENTAILMENT
'2': PARAPHRASE
- name: __language__
dtype: string
splits:
- name: train
num_bytes: 1052463
num_examples: 5000
- name: test
num_bytes: 820108
num_examples: 4000
- name: validation
num_bytes: 210810
num_examples: 1000
download_size: 0
dataset_size: 2083381
---
# Dataset Card for "assin_por_Latn_to_spa_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mehr4n-m/parsinlu-en-fa-structrual-edit | ---
license: cc-by-nc-sa-4.0
---
|
autoevaluate/autoeval-staging-eval-samsum-samsum-70f55d-15546146 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: SamuelAllen1234/testing
metrics: ['rouge', 'mse', 'mae', 'squad']
dataset_name: samsum
dataset_config: samsum
dataset_split: validation
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: SamuelAllen1234/testing
* Dataset: samsum
* Config: samsum
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@SamuelAllen12345](https://huggingface.co/SamuelAllen12345) for evaluating this model. |
NghiemAbe/sts13 | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_bytes: 262456
num_examples: 1500
download_size: 128720
dataset_size: 262456
task_categories:
- sentence-similarity
language:
- vi
---
# Dataset Card for "sts13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Metastability/humanet_data | ---
license: apache-2.0
---
|
YuehHanChen/VAL_mistral_7b | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 735931
num_examples: 316
download_size: 162835
dataset_size: 735931
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thercyl/AMZN | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 47912891
num_examples: 1375
download_size: 25877768
dataset_size: 47912891
---
# Dataset Card for "AMZN"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maghwa/OpenHermes-2-AR-10K-9 | ---
dataset_info:
features:
- name: language
dtype: 'null'
- name: views
dtype: float64
- name: model_name
dtype: 'null'
- name: topic
dtype: 'null'
- name: hash
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: conversations
dtype: string
- name: category
dtype: 'null'
- name: idx
dtype: 'null'
- name: id
dtype: 'null'
- name: title
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: model
dtype: 'null'
- name: source
dtype: string
splits:
- name: train
num_bytes: 19971229
num_examples: 10001
download_size: 8629780
dataset_size: 19971229
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pioivenium/marketov3-tokenized | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1305220
num_examples: 2201
download_size: 434632
dataset_size: 1305220
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/eizoukenniwateodasuna | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Eizouken Ni Wa Te O Dasu Na!
This is the image base of bangumi Eizouken ni wa Te o Dasu na!. We detected 17 characters and 1057 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 235 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 290 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 225 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 16 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 28 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 38 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 30 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 23 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 12 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 13 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 12 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 10 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 12 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 8 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 42 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 10 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 53 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Seanxh/twitter_dataset_1713103674 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10610
num_examples: 26
download_size: 9037
dataset_size: 10610
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
palat/bort_wikipedia | ---
license:
- cc-by-sa-3.0
---
# BORT Wikipedia Data
This is the data used to prepare the [BORT](https://huggingface.co/palat/bort) model, described by the following paper:
Robert Gale, Alexandra C. Salem, Gerasimos Fergadiotis, and Steven Bedrick. 2023. [**Mixed Orthographic/Phonemic Language Modeling: Beyond Orthographically Restricted Transformers (BORT).**](https://robertcgale.com/pub/2023-acl-bort-paper.pdf) In Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP-2023), pages TBD, Online. Association for Computational Linguistics. [[paper]](https://robertcgale.com/pub/2023-acl-bort-paper.pdf) [[poster]](https://robertcgale.com/pub/2023-acl-bort-poster.pdf)
Additional resources and information can be found [here](https://github.com/rcgale/bort).
## Acknowledgements
This work was supported by the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health under award 5R01DC015999 (Principal Investigators: Bedrick & Fergadiotis). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
## Limitations
The models presented here were trained with the basic inventory of English phonemes found in CMUDict. However, a more fine-grained phonetic analysis would require a pronunciation dictionary with more narrowly defined entries. Additionally, while this paper focused on models trained with English-only resources (pre-trained BART-BASE, English Wikipedia text, CMUDict, and the English AphasiaBank), the techniques should be applicable to non-English language models as well. Finally, from a clinical standpoint, the model we describe in this paper assumes the existence of transcribed input (from either a manual or automated source, discussed in detail in §2.1 of the paper); in its current form, this represents a limitation to its clinical implementation, though not to its use in research settings with archival or newly-transcribed datasets.
## Ethics Statement
Our use of the AphasiaBank data was governed by the TalkBank consortium's data use agreement, and the underlying recordings were collected and shared with approval of the contributing sites' institutional review boards.
Limitations exist regarding accents and dialect, which in turn would affect the scenarios in which a system based on our model could (and should) be used.
It should also be noted that these models and any derived technology are not meant to be tools to diagnose medical conditions, a task best left to qualified clinicians.
## License Information
### Wikipedia License
The Wikipedia data was derived from the Huggingface [Wikipedia](https://huggingface.co/datasets/wikipedia) dataset.
That portion of the data is subject to the following license information:
> Most of Wikipedia's text and many of its images are co-licensed under the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License)
(CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License)
(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
>
> Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such
text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes
the text.
### CMUDict License
Pronunciation dictionaries contained herein were adapted from [CMUDict](https://github.com/cmusphinx/cmudict), and as
such are subject to [their license](cmudict.license.txt). |
samchain/econo-pairs | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentenceA
dtype: string
- name: sentenceB
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 209357089
num_examples: 71582
- name: test
num_bytes: 69736578
num_examples: 23861
download_size: 162843573
dataset_size: 279093667
license: apache-2.0
task_categories:
- sentence-similarity
language:
- en
tags:
- economics
- finance
- politics
size_categories:
- 10K<n<100K
---
# Dataset Card for "econo-pairs"
Econo-pairs is a dataset made of pairs of sentences extracted from speeches by central banks and other public financial institutions worldwide. Each pair is labelled as positive (1) or negative (0).
Positive pairs consist of sentences extracted from the same speech. Negative pairs consist of sentences extracted from two different, randomly matched speeches.
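The construction rule above can be sketched as follows. This is a hypothetical illustration, not the actual build script; `speeches` and `make_pairs` are stand-in names, and the 50/50 positive/negative mix is an assumption:

```python
import random

def make_pairs(speeches, n_pairs, seed=0):
    """Build (sentenceA, sentenceB, label) tuples: label 1.0 for two
    distinct sentences drawn from the same speech, 0.0 for sentences
    drawn from two different speeches."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n_pairs):
        if rng.random() < 0.5:
            # positive pair: two distinct sentences from one speech
            speech = rng.choice([s for s in speeches if len(s) >= 2])
            a, b = rng.sample(speech, 2)
            pairs.append((a, b, 1.0))
        else:
            # negative pair: one sentence from each of two random speeches
            i, j = rng.sample(range(len(speeches)), 2)
            pairs.append((rng.choice(speeches[i]), rng.choice(speeches[j]), 0.0))
    return pairs

speeches = [
    ["Inflation remains elevated.", "We will keep policy restrictive."],
    ["Growth has slowed sharply.", "Fiscal support may be warranted."],
]
pairs = make_pairs(speeches, 4)
```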
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mandania/i-am-that-split | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 708164
num_examples: 1698
- name: test
num_bytes: 125231
num_examples: 300
download_size: 485640
dataset_size: 833395
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-futin__feed-top_en_-3f631c-2246071668 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-1.3b
metrics: []
dataset_name: futin/feed
dataset_config: top_en_
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-1.3b
* Dataset: futin/feed
* Config: top_en_
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
wange1002/000001 | ---
license: afl-3.0
---
|
open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-13b-v7-fp16 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-openllama-13b-v7-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-openllama-13b-v7-fp16](https://huggingface.co/OpenBuddy/openbuddy-openllama-13b-v7-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-13b-v7-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T17:51:28.265681](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-13b-v7-fp16/blob/main/results_2023-10-14T17-51-28.265681.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13496224832214765,\n\
\ \"em_stderr\": 0.00349915623734624,\n \"f1\": 0.19493917785234854,\n\
\ \"f1_stderr\": 0.0036402036609824453,\n \"acc\": 0.39774068872582313,\n\
\ \"acc_stderr\": 0.010563523906790405\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.13496224832214765,\n \"em_stderr\": 0.00349915623734624,\n\
\ \"f1\": 0.19493917785234854,\n \"f1_stderr\": 0.0036402036609824453\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09855951478392722,\n \
\ \"acc_stderr\": 0.008210320350946331\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634477\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-openllama-13b-v7-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T09_46_52.076737
path:
- '**/details_harness|drop|3_2023-10-13T09-46-52.076737.parquet'
- split: 2023_10_14T17_51_28.265681
path:
- '**/details_harness|drop|3_2023-10-14T17-51-28.265681.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T17-51-28.265681.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T09_46_52.076737
path:
- '**/details_harness|gsm8k|5_2023-10-13T09-46-52.076737.parquet'
- split: 2023_10_14T17_51_28.265681
path:
- '**/details_harness|gsm8k|5_2023-10-14T17-51-28.265681.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T17-51-28.265681.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T09_46_52.076737
path:
- '**/details_harness|winogrande|5_2023-10-13T09-46-52.076737.parquet'
- split: 2023_10_14T17_51_28.265681
path:
- '**/details_harness|winogrande|5_2023-10-14T17-51-28.265681.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T17-51-28.265681.parquet'
- config_name: results
data_files:
- split: 2023_10_13T09_46_52.076737
path:
- results_2023-10-13T09-46-52.076737.parquet
- split: 2023_10_14T17_51_28.265681
path:
- results_2023-10-14T17-51-28.265681.parquet
- split: latest
path:
- results_2023-10-14T17-51-28.265681.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-13b-v7-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-openllama-13b-v7-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-openllama-13b-v7-fp16](https://huggingface.co/OpenBuddy/openbuddy-openllama-13b-v7-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-13b-v7-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-14T17:51:28.265681](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-13b-v7-fp16/blob/main/results_2023-10-14T17-51-28.265681.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.13496224832214765,
"em_stderr": 0.00349915623734624,
"f1": 0.19493917785234854,
"f1_stderr": 0.0036402036609824453,
"acc": 0.39774068872582313,
"acc_stderr": 0.010563523906790405
},
"harness|drop|3": {
"em": 0.13496224832214765,
"em_stderr": 0.00349915623734624,
"f1": 0.19493917785234854,
"f1_stderr": 0.0036402036609824453
},
"harness|gsm8k|5": {
"acc": 0.09855951478392722,
"acc_stderr": 0.008210320350946331
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634477
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
irds/trec-cast_v1 | ---
pretty_name: '`trec-cast/v1`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `trec-cast/v1`
The `trec-cast/v1` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/trec-cast#trec-cast/v1).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=38,622,444
This dataset is used by: [`trec-cast_v1_2020`](https://huggingface.co/datasets/irds/trec-cast_v1_2020), [`trec-cast_v1_2020_judged`](https://huggingface.co/datasets/irds/trec-cast_v1_2020_judged)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/trec-cast_v1', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Dalton2019Cast,
title={CAsT 2019: The Conversational Assistance Track Overview},
author={Jeffrey Dalton and Chenyan Xiong and Jamie Callan},
booktitle={TREC},
year={2019}
}
```
|
Chunshen/test | ---
license: mit
---
|
bpranto/ise | ---
license: mit
---
|
CyberHarem/maiden_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of maiden/メイデン/梅登/메이든 (Nikke: Goddess of Victory)
This is the dataset of maiden/メイデン/梅登/메이든 (Nikke: Goddess of Victory), containing 39 images and their tags.
The core tags of this character are `black_hair, breasts, long_hair, red_eyes, large_breasts, hair_ornament, hair_flower, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 61.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maiden_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 39 | 31.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maiden_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 97 | 67.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maiden_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 39 | 52.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maiden_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 97 | 103.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maiden_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/maiden_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, rose, black_gloves, fingerless_gloves, mouth_mask, simple_background |
| 1 | 6 |  |  |  |  |  | 1girl, cleavage, fingerless_gloves, looking_at_viewer, nurse_cap, solo, thighhighs, white_gloves, belt, blush, open_mouth, white_dress, fishnets, holding, mouth_mask, short_dress, simple_background, syringe, thighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | rose | black_gloves | fingerless_gloves | mouth_mask | simple_background | nurse_cap | thighhighs | white_gloves | belt | blush | open_mouth | white_dress | fishnets | holding | short_dress | syringe | thighs | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:-------|:---------------|:--------------------|:-------------|:--------------------|:------------|:-------------|:---------------|:-------|:--------|:-------------|:--------------|:-----------|:----------|:--------------|:----------|:---------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kofiyo/reviews | ---
license: unlicense
---
|
sr5434/CodegebraGPT_data | ---
dataset_info:
- config_name: 100k-multimodal
features:
- name: conversations
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: int64
- name: image
dtype: string
splits:
- name: train
num_bytes: 124335530
num_examples: 100000
download_size: 64289784
dataset_size: 124335530
- config_name: 100k-text
features:
- name: conversations
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: int64
- name: image
dtype: string
splits:
- name: train
num_bytes: 124335530
num_examples: 100000
download_size: 64289784
dataset_size: 124335530
- config_name: full
features:
- name: conversations
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: int64
- name: image
dtype: string
splits:
- name: train
num_bytes: 1305046195
num_examples: 1049253
download_size: 673964053
dataset_size: 1305046195
configs:
- config_name: 100k-multimodal
data_files:
- split: train
path: 100k-multimodal/train-*
- config_name: 100k-text
data_files:
- split: train
path: 100k-text/train-*
- config_name: full
data_files:
- split: train
path: full/train-*
license: mit
task_categories:
- conversational
language:
- en
tags:
- chemistry
- biology
- code
size_categories:
- 100K<n<1M
---
A collection of datasets for finetuning LLMs on STEM-related tasks. The dataset is formatted in the [LLaVA finetuning format](https://github.com/haotian-liu/LLaVA/blob/main/docs/Finetune_Custom_Data.md#dataset-format). |
Venkatesh4342/indian-augmented-NER | ---
license: apache-2.0
---
|
xDAN-datasets/glaive_code_assistant_140K | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 417459108
num_examples: 136109
download_size: 0
dataset_size: 417459108
---
# Dataset Card for "glaive_code_assistant_140K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Infinigence/LVEval | ---
license: mit
language:
- en
- zh
viewer: true
---
# 介绍(Introduction)
**LV-Eval**是一个具备5个长度等级(16k、32k、64k、128k和256k)、最大文本测试长度达到256k的长文本评测基准。**LV-Eval**的平均文本长度达到102,380字,最小/最大文本长度为11,896/387,406字。**LV-Eval**主要有两类评测任务——单跳QA和多跳QA,共包含11个涵盖中英文的评测数据子集。**LV-Eval**设计时引入3个关键技术:干扰事实插入(**C**onfusiong **F**acts **I**nsertion,CFI)提高挑战性,关键词和短语替换(**K**eyword and **P**hrase **R**eplacement,KPR)减少信息泄漏,以及基于关键词召回的评测指标(**A**nswer **K**eywords,AK,指代结合答案关键词和字词黑名单的评价指标)提高评测数值客观性。我们希望*LV*-Eval为未来长文本大语言模型的研究发展提供有价值的性能参考。
**LV-Eval**有以下关键特性:
* **超长文本长度**: **LV-Eval**由5个长度等级构成,分别是16k、32k、64k、128k以及256k。同一数据集在不同长度等级下具有相同的问答对集合,只是构成各长度等级的上下文长度不同。我们的目的是保持问答对一致的情况下,充分测试模型在不同长度等级上下文中的性能表现,更可控地评估模型的长文本能力。
* **结合混淆和干扰信息来提升评测难度**: 构建测试数据的过程中,我们将问答相关文档和无关文档混合拼接起来构成测试文档。该构建方式在扩展文本长度的同时,可有效评测模型从冗长混淆文本中提取关键信息的能力。此外,我们还使用GPT-4生成多个干扰信息,并在人工检查后随机插入到测试文档中,以评测模型在有相似事实描述的干扰下保持准确推理的能力。
* **替换数据中的关键信息以减少信息泄漏**: 为了解决长文本能力评测中由于信息泄漏而引起的指标虚高问题,我们采用关键词和短语替换的方式处理数据的上下文以及问答对,替换后的信息不再是公共知识,也在很大程度上与数据源的原始信息不同。所有的替换词和短语标注都由人类标注员完成。这样一来, **LV-Eval**能够严格要求被测模型根据数据中实际提供的上下文信息来回答问题,而非通过“背题”或者预训练阶段的常识记忆的方式来回答问题。
* **基于关键词召回的指标可更客观公正地评测模型性能**: 目前已有的评测指标(如F1分、ROUGH等)存在受回答格式和无关字词干扰的问题,容易导致评测结果虚高。为解决这个问题,我们人工标注了答案关键词和字词黑名单。答案关键词是从原始答案中提取的最具回答信息量的词汇或短语,而字词黑名单主要包含一些无信息量的代词、助词,比如“的”、“和”、“了”等。评测指标的计算被设计为两阶段过程,以F1分数为例:第一阶段先计算模型回答对答案关键词的召回分数,如果分数低于预设阈值,则直接计0分;如果召回分数高于阈值,则进一步计算模型回答与完整答案的F1分数——首先将字词黑名单中的词从回答和答案中过滤掉,再正常进行F1分数计算。这样一来,评测指标可使得模型得分更加客观公正。
如果您想了解更多关于**LV-Eval**的细节,我们建议您参阅[GitHub代码库](https://github.com/infinigence/LVEval)以及[论文](https://arxiv.org/abs/2402.05136)。
**LV-Eval** is a challenging long-context benchmark with five length levels (16k, 32k, 64k, 128k, and 256k) reaching up to 256k words. The average number of words is 102,380, and the Min/Max number of words is 11,896/387,406. **LV-Eval** features two main tasks, single-hop QA and multi-hop QA, comprising 11 bilingual datasets. The design of **LV-Eval** incorporates three key techniques, namely confusing facts insertion (CFI), keyword and phrase replacement (KPR), and keyword-recall-based metrics (AK, short for metrics with Answer Keywords and word blacklist), which jointly provide a challenging, knowledge-leakage-mitigated, and more accurate evaluation of the long-context capability of LLMs. We anticipate that **LV-Eval** will serve as a valuable resource for supporting future research on long-context LLMs.
The Key Characteristics of **LV-Eval** include:
* **Sufficiently long context length to evaluate state-of-the-art models**: **LV-Eval** comprises 5 length levels with word counts of 16k, 32k, 64k, 128k, and 256k. Test instances across these levels share the same set of question-answer (QA) pairs, and only differ in the context content and length. Testing on the same QA pairs with different context lengths facilitates a controllable evaluation of models' long-context ability.
* **Incorporation of distraction and confusion to increase difficulty**: When constructing the context for each test instance, we mix up distracting documents and supporting documents. This approach evaluates the model's ability in pinpointing key information in a large bunch of distracting texts. In addition, we insert confusing facts generated by GPT-4 and revised by human annotators into the context. This assesses the model's capability to accurately reason in the presence of interference.
* **Keyword and phrase replacement to mitigate knowledge leakage**: To mitigate the biased evaluation of long-context ability caused by knowledge leakage, we apply keyword and phrase replacement in the context and QA pairs. The replacement rules are annotated by human annotators. In this way, **LV-Eval** requires LLMs to rely on their understanding of the long context to answer questions rather than relying on memorization or common-sense knowledge.
* **Keyword-recall-based metric for more objective scoring**: Existing *N*-gram metrics such as the F1 score are sensitive to the format variations and non-informative words in the answer, which results in inaccurate scores. To address this, we manually annotate answer keywords and a blacklist of unrelated words. The answer keywords are the critical words or sentences extracted from original ground-truth (GT) answers, while the word blacklist contains common and non-informative words such as 'the', 'a', 'of', and so on. The metric calculation follows a two-stage procedure: the first stage calculates the recall of answer keywords; if the recall exceeds a certain threshold, the second stage will remove all the blacklisted words and then calculate the F1 score between the prediction and the GT answer. This metric design can get scores with higher objectivity.
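The two-stage metric described above can be sketched as follows. This is a rough illustration only (the official scoring code lives in the GitHub repository); the whitespace tokenization, substring keyword matching, and the `threshold` default are simplifying assumptions:

```python
from collections import Counter

def keyword_recall_f1(prediction, answer, answer_keywords, blacklist, threshold=0.2):
    # Stage 1: recall of the annotated answer keywords in the prediction;
    # below the threshold the instance scores 0 outright.
    keywords = answer_keywords.split()
    hits = sum(1 for k in keywords if k in prediction)
    if keywords and hits / len(keywords) < threshold:
        return 0.0
    # Stage 2: word-level F1 after filtering blacklisted (non-informative) words
    pred_tokens = [w for w in prediction.split() if w not in blacklist]
    gold_tokens = [w for w in answer.split() if w not in blacklist]
    overlap = sum((Counter(pred_tokens) & Counter(gold_tokens)).values())
    if overlap == 0 or not pred_tokens or not gold_tokens:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```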
If you want to learn more about **LV-Eval**, we recommend you to refer to the [GitHub repository](https://github.com/infinigence/LVEval) and the [paper](https://arxiv.org/abs/2402.05136).
# How to use it?
#### Quick Start
Our dataset evaluates the long-text capabilities of large language models from multiple perspectives. Each subset is divided into several length levels, so please specify a length level in the config name when loading the dataset.
```python
from datasets import load_dataset

data = load_dataset("Infinigence/LVEval", "hotpotwikiqa_mixup_16k", split='test')
```
#### Loading Data
```python
from datasets import load_dataset
DATASET_NAMES = [
"hotpotwikiqa_mixup", "loogle_SD_mixup", "loogle_CR_mixup", "loogle_MIR_mixup", \
"multifieldqa_en_mixup", "multifieldqa_zh_mixup", "factrecall_en", "factrecall_zh", \
"cmrc_mixup", "lic_mixup", "dureader_mixup"
]
DATASET_LENGTH_LEVEL = [
'16k', '32k', '64k', '128k', '256k'
]
def get_dataset_names(dataset_names, length_levels):
datasets = []
for name in dataset_names:
for length in length_levels:
datasets.append(f"{name}_{length}")
return datasets
for dataset in get_dataset_names(DATASET_NAMES, DATASET_LENGTH_LEVEL):
data = load_dataset("Infinigence/LVEval", dataset, split='test')
```
If you want to download the data for **hotpotwikiqa_mixup**, you can visit [this link](https://huggingface.co/datasets/Infinigence/LVEval/resolve/main/hotpotwikiqa_mixup.zip).
If you need other subsets of data, simply change the zip file name in the link above.
#### Data Format
All data in **LV-Eval** follows the following format. For certain datasets ("loogle_SD_mixup," "loogle_CR_mixup," "loogle_MIR_mixup"), there is an additional key called "answer_keywords". This key indicates the most crucial word or sentence in the answer. During the evaluation of predicted values, if the match between the prediction and the "answer_keywords" falls below a certain threshold, it directly returns 0. Otherwise, it compares the "answers" list with the predicted value.
For some datasets ("factrecall_en," "factrecall_zh," "cmrc_mixup"), there is an extra key called "confusing_facts". This key represents confounding elements added to increase the benchmark difficulty; they have been randomly placed within the long texts.
For certain datasets ("hotpotwikiqa_mixup," "multifieldqa_en_mixup," "multifieldqa_zh_mixup," "lic_mixup"), both "answer_keywords" and "confusing_facts" are present.
```json
{
"input": "The input/command for the task, usually short, such as questions in QA, queries in Few-shot tasks, etc",
"context": "The documents input into the long-text task.",
"answers": "A List of all true answers",
"length": "Total length of the first three items (counted in characters for Chinese and words for English)",
"dataset": "The name of the dataset to which this piece of data belongs",
"language": "The language of this piece of data",
"answer_keywords": "The key words or sentences manually filtered from the answers.",
"confusing_facts": "This key represents confounding elements added to increase the benchmark difficulty and has been randomly placed within long texts. This helps make the test instances more challenging."
}
```
#### Evaluation
This repository provides data download for LV-Eval. If you wish to use this dataset for automated evaluation, please refer to our [github](https://github.com/infinigence/LVEval).
# Task statistics
| Task | Datasets | CFI | \#KPR | AK | Language | \#QA pairs | \#Contexts |
|:-------------:|:-----------------------:|:----------:|-------|:----------:|:--------:|:----------:|:------------:|
| Single-hop QA | loogle\_SD\_mixup | | | ✔ | en | 160 | 800 |
| | cmrc\_mixup | | 786 | | zh | 200 | 1,000 |
| | multifieldqa\_en\_mixup | ✔ | 476 | ✔ | en | 101 | 505 |
| | multifieldqa\_zh\_mixup | ✔ | 424 | ✔ | zh | 133 | 665 |
| | factrecall\_en | ✔ | 3 | ✔ | en | 1 | 200*5 |
| | factrecall\_zh | ✔ | 3 | ✔ | zh | 1 | 200*5 |
| Multi-hop QA | dureader\_mixup | | | | zh | 176 | 880 |
| | loogle\_CR\_mixup | | | ✔ | en | 99 | 495 |
| | loogle\_MR\_mixup | | | ✔ | en | 139 | 695 |
| | hotpotwikiqa\_mixup | ✔ | 232 | ✔ | en | 124 | 620 |
| | lic\_mixup | ✔ | | ✔ | zh | 197 | 985 |
The abbreviations **CFI, KPR, AK** stand for confusing fact insertion, keyword and phrase replacement, and answer keywords, respectively. The confusing facts have already been inserted into the context and appear in the jsonl file as **"confusing_facts"**. The answer keywords appear as **"answer_keywords"** in the jsonl file.
# Task construction
### Multi-hop QA
In a multi-hop QA task, the reasoning process to derive the answer needs to gather multiple pieces of information from various locations in the context.
- **lic-mixup** originates from the [Long-instruction-en2zh](https://huggingface.co/datasets/yuyijiong/Long-instruction-en2zh) dataset on Hugging Face. The original Long-instruction-en2zh contains 8,000+ high-quality Chinese multi-doc QA instances translated from English. We selected 197 QA pairs and their corresponding documents as supporting data, while the remaining documents serve as distracting data for context mixing.
- **hotpotwikiqa-mixup** originates from two Wikipedia-based multi-hop QA datasets: [HotpotQA](https://huggingface.co/datasets/hotpot_qa) and [2WikiMultihopQA](https://huggingface.co/datasets/voidful/2WikiMultihopQA). HotpotQA contains 112,779 2-hop questions that are written by native speakers according to two given paragraphs as the context. 2WikiMultihopQA contains 192,606 5-hop questions that are synthesized using manually designed templates to prevent shortcut solutions. We select 124 samples from the two datasets.
- **loogle-MR-mixup** and **loogle-CR-mixup** originate from [LooGLE](https://huggingface.co/datasets/bigainlco/LooGLE)'s Long-dependency QA task, specifically the *Multiple information Retrieval* and *Comprehension and Reasoning* subtasks. The *Multiple information Retrieval* task requires aggregation of the evidence that can be directly located in original sentences, while the *Comprehension and Reasoning* task contains implicit evidence within the context, it requires multi-step reasoning to get the correct answers. We select 139 and 99 questions for **loogle-MR-mixup** and **loogle-CR-mixup**, respectively.
- **dureader-mixup** is built from the [DuReader](https://github.com/baidu/DuReader) dataset. We first randomly select 200 instances and then manually remove 24 samples whose answers are longer than 360 words.
### Single-hop QA
In a single-hop QA task, only a single piece of evidence in the context is needed to derive the answer.
- **loogle-SD-mixup** contains 160 unique QA pairs and 800 documents originated from the short-dependency QA task in [LooGLE](https://huggingface.co/datasets/bigainlco/LooGLE).
- **cmrc-mixup** is derived from the [CMRC 2018 Public Datasets](https://github.com/ymcui/cmrc2018), designed for Chinese machine reading comprehension. It contains ~20k questions annotated on Wikipedia paragraphs by human experts. We manually pick 200 QA pairs and their corresponding documents as supporting QA pairs and paragraphs.
- **multifieldqa-en-mixup** and **multifieldqa-zh-mixup** are built from the MultiFieldQA datasets in [LongBench](https://huggingface.co/datasets/THUDM/LongBench). We manually remove questions that can be answered using common-sense knowledge without referring to the context, and eventually get 101 and 133 unique QA pairs for **multifieldqa-en-mixup** and **multifieldqa-zh-mixup**, respectively.
- **factrecall-en** and **factrecall-zh** are two synthetic datasets designed to assess the LLMs' ability to identify a small piece of evidence (“fact”) located at various locations within a very lengthy context. We write one English fact-question-answer pair for **factrecall-en** and one Chinese fact-question-answer pair for **factrecall-zh**. Distracting documents are sourced from *PG-19* dataset (English) and the book of *Dream of the Red Chamber* (Chinese) to create five contexts of different length levels. For each context, we generate 200 documents by inserting the fact at 200 evenly spaced positions within the context.
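The evenly spaced fact insertion used for the factrecall documents can be sketched as follows. This is a simplified illustration; the actual generation script and its position arithmetic may differ:

```python
def insert_fact_everywhere(context_words, fact, n_positions=200):
    """Yield one document per position, placing the fact at evenly
    spaced offsets within the distracting context."""
    step = len(context_words) / n_positions
    for i in range(n_positions):
        pos = int(i * step)
        # splice the fact into the word stream at this offset
        yield " ".join(context_words[:pos] + [fact] + context_words[pos:])
```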
# License
In **LV-Eval**, the cmrc-mixup and lic-mixup datasets follow `CC-BY-SA-4.0` license, and the other datasets follow `MIT` license.
# Citation
```
@misc{yuan2024lveval,
title={LV-Eval: A Balanced Long-Context Benchmark with 5 Length Levels Up to 256K},
author={Tao Yuan and Xuefei Ning and Dong Zhou and Zhijie Yang and Shiyao Li and Minghui Zhuang and Zheyue Tan and Zhuyu Yao and Dahua Lin and Boxun Li and Guohao Dai and Shengen Yan and Yu Wang},
year={2024},
eprint={2402.05136},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
huggingartists/nicki-minaj | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/nicki-minaj"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 2.01836 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/8ae5a5e5e030cb67814165bd038af48f.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/nicki-minaj">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Nicki Minaj</div>
<a href="https://genius.com/artists/nicki-minaj">
<div style="text-align: center; font-size: 14px;">@nicki-minaj</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
The model is available [here](https://huggingface.co/huggingartists/nicki-minaj).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/nicki-minaj")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|847| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/nicki-minaj")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
open-llm-leaderboard/details_maldv__electric-mist-7b | ---
pretty_name: Evaluation run of maldv/electric-mist-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maldv/electric-mist-7b](https://huggingface.co/maldv/electric-mist-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maldv__electric-mist-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:22:45.414201](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__electric-mist-7b/blob/main/results_2024-03-29T20-22-45.414201.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5931441068496816,\n\
\ \"acc_stderr\": 0.03313019969813812,\n \"acc_norm\": 0.6011893533111391,\n\
\ \"acc_norm_stderr\": 0.033818919029840425,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.453710740888472,\n\
\ \"mc2_stderr\": 0.014893424963710102\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137998,\n\
\ \"acc_norm\": 0.6117747440273038,\n \"acc_norm_stderr\": 0.014241614207414044\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6289583748257319,\n\
\ \"acc_stderr\": 0.004820962855749743,\n \"acc_norm\": 0.8256323441545509,\n\
\ \"acc_norm_stderr\": 0.0037864988567691206\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n\
\ \"acc_stderr\": 0.02582210611941591,\n \"acc_norm\": 0.7096774193548387,\n\
\ \"acc_norm_stderr\": 0.02582210611941591\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7798165137614679,\n \"acc_stderr\": 0.01776597865232756,\n \"\
acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.01776597865232756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289202,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289202\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823297,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4044692737430168,\n\
\ \"acc_stderr\": 0.01641444091729315,\n \"acc_norm\": 0.4044692737430168,\n\
\ \"acc_norm_stderr\": 0.01641444091729315\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001865,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001865\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087384,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087384\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328916,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328916\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.453710740888472,\n\
\ \"mc2_stderr\": 0.014893424963710102\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7150749802683505,\n \"acc_stderr\": 0.012685986125141227\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2350265352539803,\n \
\ \"acc_stderr\": 0.011679491349994874\n }\n}\n```"
repo_url: https://huggingface.co/maldv/electric-mist-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-22-45.414201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-22-45.414201.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- '**/details_harness|winogrande|5_2024-03-29T20-22-45.414201.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-22-45.414201.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_22_45.414201
path:
- results_2024-03-29T20-22-45.414201.parquet
- split: latest
path:
- results_2024-03-29T20-22-45.414201.parquet
---
# Dataset Card for Evaluation run of maldv/electric-mist-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maldv/electric-mist-7b](https://huggingface.co/maldv/electric-mist-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_maldv__electric-mist-7b",
    "harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2024-03-29T20:22:45.414201](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__electric-mist-7b/blob/main/results_2024-03-29T20-22-45.414201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.5931441068496816,
"acc_stderr": 0.03313019969813812,
"acc_norm": 0.6011893533111391,
"acc_norm_stderr": 0.033818919029840425,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.453710740888472,
"mc2_stderr": 0.014893424963710102
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137998,
"acc_norm": 0.6117747440273038,
"acc_norm_stderr": 0.014241614207414044
},
"harness|hellaswag|10": {
"acc": 0.6289583748257319,
"acc_stderr": 0.004820962855749743,
"acc_norm": 0.8256323441545509,
"acc_norm_stderr": 0.0037864988567691206
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.02582210611941591,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.02582210611941591
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.01776597865232756,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.01776597865232756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289202,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289202
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4044692737430168,
"acc_stderr": 0.01641444091729315,
"acc_norm": 0.4044692737430168,
"acc_norm_stderr": 0.01641444091729315
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001865,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001865
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087384,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087384
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328916,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328916
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.453710740888472,
"mc2_stderr": 0.014893424963710102
},
"harness|winogrande|5": {
"acc": 0.7150749802683505,
"acc_stderr": 0.012685986125141227
},
"harness|gsm8k|5": {
"acc": 0.2350265352539803,
"acc_stderr": 0.011679491349994874
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Deathspike/magical-girl-lyrical-nanoha-movie-2nd | ---
license: cc-by-nc-sa-4.0
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_4 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 924478480.0
num_examples: 180140
download_size: 944999652
dataset_size: 924478480.0
---
# Dataset Card for "chunk_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RuhamaKhan/youtube_parsed_dataset | ---
license: openrail
---
|
stepkurniawan/qa_sustainability_wiki | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: ground_truths
dtype: string
splits:
- name: train
num_bytes: 195625.12855377008
num_examples: 647
- name: test
num_bytes: 48981.87144622991
num_examples: 162
download_size: 149066
dataset_size: 244607.0
---
The purpose of this dataset is to provide question-answer (ground truth) pairs in a table format. The questions and answers were all created with langchain x gpt-4,
since creating them manually would have taken a long time. However, as due diligence, I have randomly checked more than 50% of the questions and answers
and judged that the dataset is safe to use.
The questions and answers are sourced from a private wiki page called the Sustainable Methods Wiki, created by Prof. Henrik v. Wahrden.
Link: https://sustainabilitymethods.org/index.php/Main_Page
|
masonlf/Ergoscript | ---
license: mit
---
|
open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1 | ---
pretty_name: Evaluation run of Phind/Phind-CodeLlama-34B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Phind/Phind-CodeLlama-34B-v1](https://huggingface.co/Phind/Phind-CodeLlama-34B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T17:56:04.803454](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1/blob/main/results_2023-09-17T17-56-04.803454.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3409186241610738,\n\
\ \"em_stderr\": 0.004854388549221253,\n \"f1\": 0.3901226929530212,\n\
\ \"f1_stderr\": 0.004753426310613145,\n \"acc\": 0.46541261736516804,\n\
\ \"acc_stderr\": 0.01182360456434163\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3409186241610738,\n \"em_stderr\": 0.004854388549221253,\n\
\ \"f1\": 0.3901226929530212,\n \"f1_stderr\": 0.004753426310613145\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2047005307050796,\n \
\ \"acc_stderr\": 0.011113916396062963\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.0125332927326203\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Phind/Phind-CodeLlama-34B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T17_56_04.803454
path:
- '**/details_harness|drop|3_2023-09-17T17-56-04.803454.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T17-56-04.803454.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T17_56_04.803454
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-56-04.803454.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-56-04.803454.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T17_56_04.803454
path:
- '**/details_harness|winogrande|5_2023-09-17T17-56-04.803454.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T17-56-04.803454.parquet'
- config_name: results
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- results_2023-08-26T05:41:49.471462.parquet
- split: 2023_09_17T17_56_04.803454
path:
- results_2023-09-17T17-56-04.803454.parquet
- split: latest
path:
- results_2023-09-17T17-56-04.803454.parquet
---
# Dataset Card for Evaluation run of Phind/Phind-CodeLlama-34B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Phind/Phind-CodeLlama-34B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Phind/Phind-CodeLlama-34B-v1](https://huggingface.co/Phind/Phind-CodeLlama-34B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1",
"harness_winogrande_5",
split="train")
```
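As the configuration listing above shows, each run-specific split is named after the run timestamp, with `-` and `:` replaced by `_` (the parquet filenames keep the original characters). A minimal sketch of that convention — the helper name is ours, not part of the dataset:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp such as '2023-09-17T17:56:04.803454' into
    the split name used in this dataset's configurations.

    This mirrors the naming observed in the config listing above:
    '-' and ':' become '_', everything else is kept as-is.
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2023-09-17T17:56:04.803454"))
# → 2023_09_17T17_56_04.803454
```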
## Latest results
These are the [latest results from run 2023-09-17T17:56:04.803454](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1/blob/main/results_2023-09-17T17-56-04.803454.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"em": 0.3409186241610738,
"em_stderr": 0.004854388549221253,
"f1": 0.3901226929530212,
"f1_stderr": 0.004753426310613145,
"acc": 0.46541261736516804,
"acc_stderr": 0.01182360456434163
},
"harness|drop|3": {
"em": 0.3409186241610738,
"em_stderr": 0.004854388549221253,
"f1": 0.3901226929530212,
"f1_stderr": 0.004753426310613145
},
"harness|gsm8k|5": {
"acc": 0.2047005307050796,
"acc_stderr": 0.011113916396062963
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.0125332927326203
}
}
```
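Each results split deserializes into nested dictionaries keyed by `harness|task|n_shot`, as shown above. A minimal sketch working directly on those figures — the literal below copies a few of the latest-run values for illustration, not a live download:

```python
# A subset of the latest results shown above, keyed by "harness|task|n_shot".
results = {
    "harness|drop|3": {"em": 0.3409186241610738, "f1": 0.3901226929530212},
    "harness|gsm8k|5": {"acc": 0.2047005307050796},
    "harness|winogrande|5": {"acc": 0.7261247040252565},
}

# Pick out the accuracy of every task that reports one (drop reports em/f1 instead).
accuracies = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
print(accuracies)
# → {'harness|gsm8k|5': 0.2047005307050796, 'harness|winogrande|5': 0.7261247040252565}
```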
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
W1lson/RMData | ---
dataset_info:
features:
- name: Source ID
dtype: int64
- name: Primary Text
dtype: string
- name: Artifact Type
dtype: string
- name: Design Package
dtype: string
- name: Location
dtype: string
- name: Verification Method
dtype: string
- name: Validation Method
dtype: string
splits:
- name: train
num_bytes: 6326
num_examples: 35
download_size: 7719
dataset_size: 6326
---
# Dataset Card for "RMData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/419_People_Colorful_Living_Face_Anti_Spoofing_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
419 People – Colorful Living Face & Anti-Spoofing Data. Collection scenes include indoor and outdoor settings. The data covers both males and females, with ages ranging from juveniles to the elderly; young and middle-aged people make up the majority. Devices include cellphones and pads. The data covers various devices, various anti-spoofing samples, multiple lighting conditions, and multiple scenes, and can be used for tasks such as colorful remote ID authentication and living-face anti-spoofing.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1217?source=Huggingface
## Data size
419 people; for 11 of them, the data consists of 3D head models or 3D facial masks collected by 2-3 people wearing the masks, and no subsequent data statistics were compiled for these
## Population distribution
Race distribution: Asian; Gender distribution: 204 males, 204 females; Age distribution: 40 people under 18 years old, 258 people aged from 18 to 45, 73 people aged from 46 to 60, 37 people over 60 years old
## Collecting environment
248 people in indoor scenes, 160 people in outdoor scenes
## Data diversity
various devices, various anti-spoofing samples, multiple light conditions, multiple scenes
## Device
cellphone, Pad
## Data format
.mp4
## Annotation content:
Each person is labeled with ID, race, gender, age, collecting scene, glasses state, and light condition
## Accuracy
Action accuracy exceeds 97%; label annotation accuracy is not less than 97%
# Licensing Information
Commercial License
|
Wangchunshu/VLUE | ---
license: afl-3.0
---
|
yuvalkirstain/task_prediction_train3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: path
dtype: string
- name: text
dtype: string
- name: task_name
dtype: string
splits:
- name: train
num_bytes: 659890949
num_examples: 5663600
- name: validation
num_bytes: 7823929
num_examples: 60002
- name: test
num_bytes: 153998
num_examples: 2057
download_size: 148209849
dataset_size: 667868876
---
# Dataset Card for "task_prediction_train3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iwaaaaa/coquito | ---
license: artistic-2.0
---
|
CyberHarem/illyasviel_von_einzbern_fatestaynightufotable | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Illyasviel Von Einzbern (Fate Stay Night [UFOTABLE])
This is the dataset of Illyasviel Von Einzbern (Fate Stay Night [UFOTABLE]), containing 95 images and their tags.
The core tags of this character are `long_hair, white_hair, red_eyes, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 95 | 88.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 95 | 88.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 221 | 178.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/illyasviel_von_einzbern_fatestaynightufotable',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
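For the `IMG+TXT` packages (e.g. `dataset-1200.zip`), we assume each image ships with a sibling tag file sharing its stem and a `.txt` extension; a minimal pairing sketch under that assumption (the helper name and extension list are ours):

```python
import os

# Common image extensions; adjust if the archive contains other formats.
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}


def pair_images_with_tags(dataset_dir):
    """Yield (image_path, tag_text) pairs from an extracted IMG+TXT package,
    assuming each image has a sibling .txt file with the same stem."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # image without a tag file; skip it
        with open(txt_path, encoding="utf-8") as f:
            yield os.path.join(dataset_dir, name), f.read().strip()
```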
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, ascot, closed_mouth, purple_shirt, solo, upper_body, anime_coloring, frown, looking_at_viewer, long_sleeves |
| 1 | 13 |  |  |  |  |  | 1girl, long_sleeves, purple_shirt, solo, white_skirt, ascot, anime_coloring, closed_mouth |
| 2 | 25 |  |  |  |  |  | 1girl, papakha, white_scarf, purple_headwear, coat, solo, anime_coloring, closed_mouth, looking_at_viewer, smile, outdoors, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ascot | closed_mouth | purple_shirt | solo | upper_body | anime_coloring | frown | looking_at_viewer | long_sleeves | white_skirt | papakha | white_scarf | purple_headwear | coat | smile | outdoors |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------|:---------------|:-------|:-------------|:-----------------|:--------|:--------------------|:---------------|:--------------|:----------|:--------------|:------------------|:-------|:--------|:-----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | X | | X | | | X | X | | | | | | |
| 2 | 25 |  |  |  |  |  | X | | X | | X | X | X | | X | | | X | X | X | X | X | X |
|
NimbusTheOne/GoogleMusic1 | ---
license: openrail
task_categories:
- text-classification
language:
- en
tags:
- music
pretty_name: GM1
size_categories:
- 100K<n<1M
--- |
open-llm-leaderboard/details_mvpmaster__pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp | ---
pretty_name: Evaluation run of mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp](https://huggingface.co/mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mvpmaster__pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T21:07:02.623911](https://huggingface.co/datasets/open-llm-leaderboard/details_mvpmaster__pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp/blob/main/results_2024-03-21T21-07-02.623911.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6555154124082347,\n\
\ \"acc_stderr\": 0.031953975895505166,\n \"acc_norm\": 0.6556857924606263,\n\
\ \"acc_norm_stderr\": 0.0326111399873225,\n \"mc1\": 0.4565483476132191,\n\
\ \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.6269217532109524,\n\
\ \"mc2_stderr\": 0.015229668754636253\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.013864152159177275,\n\
\ \"acc_norm\": 0.6928327645051194,\n \"acc_norm_stderr\": 0.013481034054980941\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6786496713802032,\n\
\ \"acc_stderr\": 0.004660405565338758,\n \"acc_norm\": 0.8658633738299144,\n\
\ \"acc_norm_stderr\": 0.00340102551787373\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126243,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126243\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608315,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608315\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.016568971233548606,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.016568971233548606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422466,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422466\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.012700582404768221,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.012700582404768221\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545436,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545436\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4565483476132191,\n\
\ \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.6269217532109524,\n\
\ \"mc2_stderr\": 0.015229668754636253\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510432\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.714177407126611,\n \
\ \"acc_stderr\": 0.012444963460615624\n }\n}\n```"
repo_url: https://huggingface.co/mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-07-02.623911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-07-02.623911.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- '**/details_harness|winogrande|5_2024-03-21T21-07-02.623911.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T21-07-02.623911.parquet'
- config_name: results
data_files:
- split: 2024_03_21T21_07_02.623911
path:
- results_2024-03-21T21-07-02.623911.parquet
- split: latest
path:
- results_2024-03-21T21-07-02.623911.parquet
---
# Dataset Card for Evaluation run of mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp](https://huggingface.co/mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mvpmaster__pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T21:07:02.623911](https://huggingface.co/datasets/open-llm-leaderboard/details_mvpmaster__pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp/blob/main/results_2024-03-21T21-07-02.623911.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6555154124082347,
"acc_stderr": 0.031953975895505166,
"acc_norm": 0.6556857924606263,
"acc_norm_stderr": 0.0326111399873225,
"mc1": 0.4565483476132191,
"mc1_stderr": 0.01743728095318369,
"mc2": 0.6269217532109524,
"mc2_stderr": 0.015229668754636253
},
"harness|arc:challenge|25": {
"acc": 0.6578498293515358,
"acc_stderr": 0.013864152159177275,
"acc_norm": 0.6928327645051194,
"acc_norm_stderr": 0.013481034054980941
},
"harness|hellaswag|10": {
"acc": 0.6786496713802032,
"acc_stderr": 0.004660405565338758,
"acc_norm": 0.8658633738299144,
"acc_norm_stderr": 0.00340102551787373
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126243,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608315,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608315
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.016568971233548606,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.016568971233548606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422466,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422466
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768221,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545436,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545436
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4565483476132191,
"mc1_stderr": 0.01743728095318369,
"mc2": 0.6269217532109524,
"mc2_stderr": 0.015229668754636253
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510432
},
"harness|gsm8k|5": {
"acc": 0.714177407126611,
"acc_stderr": 0.012444963460615624
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Shawn0069/resume_classification_kaggle | ---
dataset_info:
features:
- name: ID
dtype: int64
- name: Resume_str
dtype: string
- name: Resume_html
dtype: string
- name: Category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 43644580
num_examples: 1987
- name: test
num_bytes: 11175285
num_examples: 497
- name: validation
num_bytes: 11175285
num_examples: 497
download_size: 24410997
dataset_size: 65995150
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
mkashani-phd/BLE_WBAN | ---
license: mit
---
|
WilliamWen/activity_datasets | ---
license: apache-2.0
task_categories:
- token-classification
language:
- en
--- |
TrainThenObtain-ai/Jarvis-tiny | ---
license: creativeml-openrail-m
---
|
maxolotl/must-c-en-fr-wait07_21.8 | ---
dataset_info:
features:
- name: current_source
dtype: string
- name: current_target
dtype: string
- name: target_token
dtype: string
splits:
- name: train
num_bytes: 1140444988
num_examples: 5459617
- name: test
num_bytes: 12622881
num_examples: 63342
- name: validation
num_bytes: 5965971
num_examples: 28830
download_size: 181926664
dataset_size: 1159033840
---
# Dataset Card for "must-c-en-fr-wait07_21.8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MITCriticalData/Unlabeled_top_10_cities_forward_backward_alg | ---
license: mit
---
|
ophycare/chatdoctor-dataset | ---
license: llama2
---
|
louisbrulenaudet/code-cinema-image-animee | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code du cinéma et de l'image animée
source_datasets:
- original
pretty_name: Code du cinéma et de l'image animée
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code du cinéma et de l'image animée, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
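For illustration, a single record might look as follows; every field value below is hypothetical and not taken from the actual dataset, only the field names and types match the schema above:

```python
# Hypothetical example of one record; all values are illustrative only.
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code du cinéma et de l'image animée, art. L111-1",
    "output": "Le Centre national du cinéma et de l'image animée [...]",
    "start": "2009-07-25",
    "expiration": "2999-01-01",
    "num": "L111-1",
}
# Each record pairs an instruction with an article reference as input
# and the expected full text of that article as output.
```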
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
jxm/cr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 192172
num_examples: 1775
- name: test
num_bytes: 219871
num_examples: 2000
- name: dev
num_bytes: 29232
num_examples: 256
download_size: 253672
dataset_size: 441275
---
# Dataset Card for "cr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zelros/pj-da | ---
tags:
- insurance
---
This dataset contains question/answer pairs from a French legal protection insurance (https://www.service-public.fr/particuliers/vosdroits/F3049?lang=en).
The objective of this dataset is to contribute to open source research projects aiming to, for instance:
* fine-tune LLMs on high-quality datasets, specializing them in the insurance domain
* develop new question/answer applications using Retrieval Augmented Generation (RAG) for insurance contracts
* assess the knowledge of language models in the insurance field
* more generally, apply LLMs to the insurance domain for better understanding and increased transparency of this industry.
Other datasets of the same kind are also available - or will be available soon - and are part of this research effort. See here: https://huggingface.co/collections/zelros/legal-protection-insurance-6536e8f389dd48faca78447e
Here is an example of usage of this dataset: https://huggingface.co/spaces/zelros/The-legal-protection-insurance-comparator |
saaadh/alpaca_hw_dataset | ---
license: llama2
---
|
Seenka/banners-Canal_13_AR-20230628T190000-20230628T200000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: timestamp
dtype: timestamp[ms, tz=America/Argentina/Buenos_Aires]
- name: video_storage_path
dtype: string
- name: timedelta
dtype: time64[us]
- name: yolo_seenka_out
list:
- name: class
dtype: int64
- name: confidence
dtype: float64
- name: name
dtype: string
- name: xmax
dtype: float64
- name: xmin
dtype: float64
- name: ymax
dtype: float64
- name: ymin
dtype: float64
- name: yolo_filter_param
dtype: int64
- name: cropped_seenka_image
dtype: image
- name: embeddings_cropped
sequence: float32
- name: entropy
dtype: float64
- name: contrast
dtype: float64
splits:
- name: train
num_bytes: 342643851.5
num_examples: 3598
download_size: 341126878
dataset_size: 342643851.5
---
# Dataset Card for "banners-Canal_13_AR-20230628T190000-20230628T200000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_openchat__openchat_v2_w | ---
pretty_name: Evaluation run of openchat/openchat_v2_w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat_v2_w](https://huggingface.co/openchat/openchat_v2_w) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v2_w\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T10:16:39.894095](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2_w/blob/main/results_2023-10-25T10-16-39.894095.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.0004320097346038692,\n \"f1\": 0.06345113255033595,\n\
\ \"f1_stderr\": 0.0013770461350277562,\n \"acc\": 0.4217142689595871,\n\
\ \"acc_stderr\": 0.009831291629413687\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346038692,\n\
\ \"f1\": 0.06345113255033595,\n \"f1_stderr\": 0.0013770461350277562\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0841546626231994,\n \
\ \"acc_stderr\": 0.007647024046603207\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224167\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat_v2_w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|arc:challenge|25_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T04_55_59.182634
path:
- '**/details_harness|drop|3_2023-10-19T04-55-59.182634.parquet'
- split: 2023_10_25T10_16_39.894095
path:
- '**/details_harness|drop|3_2023-10-25T10-16-39.894095.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T10-16-39.894095.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T04_55_59.182634
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-55-59.182634.parquet'
- split: 2023_10_25T10_16_39.894095
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-16-39.894095.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-16-39.894095.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hellaswag|10_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:07:10.180940.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T16:07:10.180940.parquet'
- split: 2023_08_09T10_10_49.498602
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:10:49.498602.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:10:49.498602.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T04_55_59.182634
path:
- '**/details_harness|winogrande|5_2023-10-19T04-55-59.182634.parquet'
- split: 2023_10_25T10_16_39.894095
path:
- '**/details_harness|winogrande|5_2023-10-25T10-16-39.894095.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T10-16-39.894095.parquet'
- config_name: results
data_files:
- split: 2023_07_24T16_07_10.180940
path:
- results_2023-07-24T16:07:10.180940.parquet
- split: 2023_08_09T10_10_49.498602
path:
- results_2023-08-09T10:10:49.498602.parquet
- split: 2023_10_19T04_55_59.182634
path:
- results_2023-10-19T04-55-59.182634.parquet
- split: 2023_10_25T10_16_39.894095
path:
- results_2023-10-25T10-16-39.894095.parquet
- split: latest
path:
- results_2023-10-25T10-16-39.894095.parquet
---
# Dataset Card for Evaluation run of openchat/openchat_v2_w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v2_w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v2_w](https://huggingface.co/openchat/openchat_v2_w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2_w",
"harness_winogrande_5",
split="train")
```
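The split names listed in the configs above are derived from each run's timestamp. A minimal sketch of the convention, inferred from the file listings in this card (dashes and colons become underscores, the fractional-seconds dot is kept):

```python
def split_name(timestamp: str) -> str:
    """Derive a split name from a run timestamp, as used in this card.

    Convention inferred from the config listings above; not an official API.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# The run recorded at 2023-10-25T10:16:39.894095 becomes:
print(split_name("2023-10-25T10:16:39.894095"))  # → 2023_10_25T10_16_39.894095
```

This is why `split="2023_10_25T10_16_39.894095"` selects the results of that specific run, while `split="latest"` always resolves to the most recent one.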
## Latest results
These are the [latest results from run 2023-10-25T10:16:39.894095](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2_w/blob/main/results_2023-10-25T10-16-39.894095.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038692,
"f1": 0.06345113255033595,
"f1_stderr": 0.0013770461350277562,
"acc": 0.4217142689595871,
"acc_stderr": 0.009831291629413687
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038692,
"f1": 0.06345113255033595,
"f1_stderr": 0.0013770461350277562
},
"harness|gsm8k|5": {
"acc": 0.0841546626231994,
"acc_stderr": 0.007647024046603207
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224167
}
}
```
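As a quick sanity check on the figures above, the aggregate `"all"` accuracy appears to be the unweighted mean of the two accuracy-reporting tasks (gsm8k and winogrande) in this run:

```python
# Per-task accuracies copied from the latest results above
gsm8k_acc = 0.0841546626231994
winogrande_acc = 0.7592738752959748

# The "all" accuracy is consistent with their unweighted mean
all_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(all_acc - 0.4217142689595871) < 1e-9
```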
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
albertvillanova/carbon_24 | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- cif
license:
- mit
multilinguality:
- other-crystallography
size_categories:
- unknown
source_datasets: []
task_categories:
- other
task_ids: []
pretty_name: Carbon-24
tags:
- material-property-optimization
- material-reconstruction
- material-generation
---
# Dataset Card for Carbon-24
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/txie-93/cdvae/tree/main/data/carbon_24
- **Paper:** [Crystal Diffusion Variational Autoencoder for Periodic Material Generation](https://arxiv.org/abs/2110.06197)
- **Leaderboard:**
- **Point of Contact:** [Tian Xie](mailto:txie@csail.mit.edu)
### Dataset Summary
Carbon-24 contains 10k carbon materials, which share the same composition but have different structures. There is 1 element, and the materials have 6 to 24 atoms in the unit cells.
Carbon-24 includes various carbon structures obtained via ab initio random structure searching (AIRSS) (Pickard & Needs, 2006; 2011) performed at 10 GPa.
The original dataset includes 101529 carbon structures, and we selected the 10% of carbon structures with the lowest energy per atom to create Carbon-24. All 10153 structures are at a local energy minimum after DFT relaxation. The most stable structure is diamond at 10 GPa. All remaining structures are thermodynamically unstable but may be kinetically stable.
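The headline counts above are mutually consistent; a quick arithmetic check (not from the original paper) shows that retaining the lowest-energy 10% of 101529 structures yields the stated 10153:

```python
total_structures = 101529  # carbon structures in the original AIRSS dataset
fraction_kept = 0.10       # lowest energy-per-atom decile retained

selected = round(total_structures * fraction_kept)
print(selected)  # → 10153
```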
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
Please consider citing the following papers:
```
@article{xie2021crystal,
title={Crystal Diffusion Variational Autoencoder for Periodic Material Generation},
author={Tian Xie and Xiang Fu and Octavian-Eugen Ganea and Regina Barzilay and Tommi Jaakkola},
year={2021},
eprint={2110.06197},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
and
```
@misc{carbon2020data,
doi = {10.24435/MATERIALSCLOUD:2020.0026/V1},
url = {https://archive.materialscloud.org/record/2020.0026/v1},
author = {Pickard, Chris J.},
keywords = {DFT, ab initio random structure searching, carbon},
language = {en},
title = {AIRSS data for carbon at 10GPa and the C+N+H+O system at 1GPa},
publisher = {Materials Cloud},
year = {2020},
copyright = {info:eu-repo/semantics/openAccess}
}
```
### Contributions
Thanks to [@albertvillanova](https://github.com/albertvillanova) for adding this dataset.
|
CanadianGamer/RPdata | ---
license: mit
---
|
open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned | ---
pretty_name: Evaluation run of SCE/Mistral-7B-math-ia3-tuned
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SCE/Mistral-7B-math-ia3-tuned](https://huggingface.co/SCE/Mistral-7B-math-ia3-tuned)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T07:55:26.696001](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned/blob/main/results_2024-01-29T07-55-26.696001.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.597094516437562,\n\
\ \"acc_stderr\": 0.033396196017173016,\n \"acc_norm\": 0.6014163034201743,\n\
\ \"acc_norm_stderr\": 0.03407797923224814,\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093893,\n \"mc2\": 0.5807124282513559,\n\
\ \"mc2_stderr\": 0.015370155281237467\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985998,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650647\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6082453694483171,\n\
\ \"acc_stderr\": 0.004871447106554924,\n \"acc_norm\": 0.8079067914758016,\n\
\ \"acc_norm_stderr\": 0.003931408309245499\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849725,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849725\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699944,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699944\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"\
acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990915,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990915\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n\
\ \"acc_stderr\": 0.014583812465862538,\n \"acc_norm\": 0.789272030651341,\n\
\ \"acc_norm_stderr\": 0.014583812465862538\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n\
\ \"acc_stderr\": 0.015984204545268565,\n \"acc_norm\": 0.35307262569832404,\n\
\ \"acc_norm_stderr\": 0.015984204545268565\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937613,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937613\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037103,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885998,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093893,\n \"mc2\": 0.5807124282513559,\n\
\ \"mc2_stderr\": 0.015370155281237467\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237985\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4184988627748294,\n \
\ \"acc_stderr\": 0.013588287284030866\n }\n}\n```"
repo_url: https://huggingface.co/SCE/Mistral-7B-math-ia3-tuned
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|arc:challenge|25_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|gsm8k|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hellaswag|10_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T07-55-26.696001.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- '**/details_harness|winogrande|5_2024-01-29T07-55-26.696001.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T07-55-26.696001.parquet'
- config_name: results
data_files:
- split: 2024_01_29T07_55_26.696001
path:
- results_2024-01-29T07-55-26.696001.parquet
- split: latest
path:
- results_2024-01-29T07-55-26.696001.parquet
---
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-tuned
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-tuned](https://huggingface.co/SCE/Mistral-7B-math-ia3-tuned) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-29T07:55:26.696001](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned/blob/main/results_2024-01-29T07-55-26.696001.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.597094516437562,
"acc_stderr": 0.033396196017173016,
"acc_norm": 0.6014163034201743,
"acc_norm_stderr": 0.03407797923224814,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093893,
"mc2": 0.5807124282513559,
"mc2_stderr": 0.015370155281237467
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985998,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650647
},
"harness|hellaswag|10": {
"acc": 0.6082453694483171,
"acc_stderr": 0.004871447106554924,
"acc_norm": 0.8079067914758016,
"acc_norm_stderr": 0.003931408309245499
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849725,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849725
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699944,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699944
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.043902592653775614,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.043902592653775614
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990915,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990915
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862538,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862538
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35307262569832404,
"acc_stderr": 0.015984204545268565,
"acc_norm": 0.35307262569832404,
"acc_norm_stderr": 0.015984204545268565
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937613,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937613
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037103,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885998,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093893,
"mc2": 0.5807124282513559,
"mc2_stderr": 0.015370155281237467
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237985
},
"harness|gsm8k|5": {
"acc": 0.4184988627748294,
"acc_stderr": 0.013588287284030866
}
}
```
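Each per-task entry in the results JSON follows the same shape, so aggregate metrics can be recomputed directly from it. A minimal sketch using two accuracy values copied verbatim from the JSON above:

```python
# Per-task entries from the results JSON above (values copied verbatim).
results = {
    "harness|winogrande|5": {"acc": 0.7655880031570639},
    "harness|gsm8k|5": {"acc": 0.4184988627748294},
}

# Macro-average accuracy over the selected tasks.
avg_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(avg_acc, 4))  # → 0.592
```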
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
OpenDILabCommunity/LMDrive | ---
configs:
- config_name: default
data_files:
- split: train
path: navigation_instruction_list.txt
sep: " "
default: true
license: apache-2.0
language:
- en
size_categories:
- n>1T
---
# LMDrive 64K Dataset Card
LMDrive Dataset consists of 64K instruction-sensor-control data clips collected in the CARLA simulator, where each clip includes one navigation instruction, several notice instructions, a sequence of multi-modal multi-view sensor data, and control signals. Each clip spans from 2 to 20 seconds.
## Dataset details
- `data/`: dataset folder, the entire dataset contains about 2T of data.
- `data/Town01`: sub-dataset folder containing only the data collected in the Town01 map
- `data/Town02`: sub-dataset folder containing only the data collected in the Town02 map
- ...
- `dataset_index.txt`: the data list for pretraining the vision encoder
- `navigation_instruction_list.txt`: the data list for instruction finetuning
- `notice_instruction_list.json`: the data list for instruction finetuning (optional if the notice instruction data is not engaged in the training)
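The YAML header declares `navigation_instruction_list.txt` as a space-separated file. A minimal sketch of parsing one entry — the line format shown here is an assumption for illustration; see the LMDrive GitHub repo for the actual layout:

```python
# Hypothetical line from navigation_instruction_list.txt; the real field
# layout may differ -- consult the LMDrive repo for the actual format.
line = "Town01/clip_0001 turn left at the next intersection"

# Split off the clip path; the remainder is the navigation instruction.
clip_path, instruction = line.split(" ", 1)
print(clip_path)  # → Town01/clip_0001
```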
**Dataset date:**
LMDrive-1.0 Dataset was collected in September 2023.
**Paper or resources for more information:**
GitHub: https://github.com/opendilab/LMDrive/README.md
Paper: https://arxiv.org/abs/2312.07488
**License:**
Attribution-NonCommercial 4.0 International
**Where to send questions or comments about the dataset:**
https://github.com/opendilab/LMDrive/issues
## Intended use
**Primary intended uses:**
The primary use of LMDrive is research on large multimodal models for autonomous driving.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, large multimodal model, autonomous driving, and artificial intelligence. |
qjckevin/Movielens_Prompt | ---
license: other
---
|
autoevaluate/autoeval-staging-eval-cnn_dailymail-3.0.0-25032a-15466140 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: SamuelAllen123/t5-efficient-large-nl36_fine_tuned_for_sum
metrics: ['mae', 'mse', 'rouge', 'squad']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: train
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: SamuelAllen123/t5-efficient-large-nl36_fine_tuned_for_sum
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@samuelallen123](https://huggingface.co/samuelallen123) for evaluating this model. |
davanstrien/ia_example | ---
dataset_info:
features:
- name: url
dtype: string
- name: choice
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 8490139.0
num_examples: 113
download_size: 8470454
dataset_size: 8490139.0
---
# Dataset Card for "ia_example"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_serial_verb_give | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 757
num_examples: 4
- name: test
num_bytes: 538
num_examples: 2
- name: train
num_bytes: 449
num_examples: 2
download_size: 10835
dataset_size: 1744
---
# Dataset Card for "MULTI_VALUE_stsb_serial_verb_give"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
uproai/chat-90k | ---
license: openrail
task_categories:
- text2text-generation
- text-generation
language:
- en
- ja
- zh
tags:
- RP
---
# chat-90k v1.0
chat-90k is a dataset composed of role-play chat messages, featuring the following columns:
```
sender: message sender ID
aid: bot ID
kind: 1: user message, 2: bot message
content: message content
```
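A tiny illustration of this schema in plain Python — the rows below are hypothetical, but follow the documented columns:

```python
# Hypothetical rows following the documented schema (sender, aid, kind, content).
rows = [
    {"sender": "u42", "aid": "bot7", "kind": 1, "content": "Hello there!"},
    {"sender": "bot7", "aid": "bot7", "kind": 2, "content": "Greetings, traveler."},
]

# kind == 2 marks bot messages.
bot_replies = [r["content"] for r in rows if r["kind"] == 2]
print(bot_replies)  # → ['Greetings, traveler.']
```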
## Query with duckdb
```python
import duckdb  # pandas must also be installed for .to_df()

localdatafile = 'messages.parquet'
# read_parquet scans the parquet file; .to_df() converts the result to a pandas DataFrame
df = duckdb.sql(f"select * from read_parquet('{localdatafile}')").to_df()
df
```
more examples: [colab](https://colab.research.google.com/drive/1cmNpsamcbELWnERICxBwsz3Bxi7eGIux?usp=sharing)
|
AkshilShah21/food_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Baked Potato
'1': Crispy Chicken
'2': Donut
'3': Fries
'4': Hot Dog
'5': Sandwich
'6': Taco
'7': Taquito
'8': apple_pie
'9': burger
'10': butter_naan
'11': chai
'12': chapati
'13': cheesecake
'14': chicken_curry
'15': chole_bhature
'16': dal_makhani
'17': dhokla
'18': fried_rice
'19': ice_cream
'20': idli
'21': jalebi
'22': kaathi_rolls
'23': kadai_paneer
'24': kulfi
'25': masala_dosa
'26': momos
'27': omelette
'28': paani_puri
'29': pakode
'30': pav_bhaji
'31': pizza
'32': samosa
'33': sushi
splits:
- name: train
num_bytes: 1232388907.4630044
num_examples: 19098
- name: test
num_bytes: 319184734.0839955
num_examples: 4775
download_size: 1820263555
dataset_size: 1551573641.547
---
# Dataset Card for "food_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
overflowwwww/yt-da-public-v2 | ---
task_categories:
- audio-classification
language:
- da
--- |
lewtun/helpful-anthropic-raw | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: demonstration
dtype: string
splits:
- name: train
num_bytes: 26008407
num_examples: 65842
download_size: 15735838
dataset_size: 26008407
---
# Dataset Card for "helpful-anthropic-raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
saattrupdan/womens-clothing-ecommerce-reviews | ---
dataset_info:
features:
- name: review_text
dtype: string
- name: age
dtype: int64
- name: rating
dtype: int64
- name: positive_feedback_count
dtype: int64
- name: division_name
dtype: string
- name: department_name
dtype: string
- name: class_name
dtype: string
- name: recommended_ind
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 7811312.540347158
num_examples: 20641
- name: val
num_bytes: 378436.72982642107
num_examples: 1000
- name: test
num_bytes: 378436.72982642107
num_examples: 1000
download_size: 4357015
dataset_size: 8568186.0
task_categories:
- text-classification
language:
- en
tags:
- multimodal
pretty_name: Women's Clothing E-Commerce Reviews
size_categories:
- 10K<n<100K
---
# Dataset Card for "womens-clothing-ecommerce-reviews"
Processed version of [this dataset](https://github.com/ya-stack/Women-s-Ecommerce-Clothing-Reviews). |
sofiapaklina/grdmr_test_zoo_648292 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
license: cc-by-4.0
task_categories:
- text-generation
- text2text-generation
language:
- ru
tags:
- chat
size_categories:
- 10K<n<100K
--- |
DonGenialo/pixel_images_587 | ---
language:
- en
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 33059748.0
num_examples: 587
download_size: 30123106
dataset_size: 33059748.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wefussell/amasum-temporal-df | ---
license: mit
---
|
Brendan/multiwoz_turns_v24 | ---
dataset_info:
features:
- name: dialogue_id
dtype: string
- name: turn_id
dtype: int64
- name: user
dtype: string
- name: system_response
dtype: string
- name: history
sequence: string
- name: system_acts
struct:
- name: Attraction-Inform
sequence:
sequence: string
- name: Attraction-NoOffer
sequence:
sequence: string
- name: Attraction-Recommend
sequence:
sequence: string
- name: Attraction-Request
sequence:
sequence: string
- name: Attraction-Select
sequence:
sequence: string
- name: Booking-Book
sequence:
sequence: string
- name: Booking-Inform
sequence:
sequence: string
- name: Booking-NoBook
sequence:
sequence: string
- name: Booking-Request
sequence:
sequence: string
- name: Hotel-Inform
sequence:
sequence: string
- name: Hotel-NoOffer
sequence:
sequence: string
- name: Hotel-Recommend
sequence:
sequence: string
- name: Hotel-Request
sequence:
sequence: string
- name: Hotel-Select
sequence:
sequence: string
- name: Restaurant-Inform
sequence:
sequence: string
- name: Restaurant-NoOffer
sequence:
sequence: string
- name: Restaurant-Recommend
sequence:
sequence: string
- name: Restaurant-Request
sequence:
sequence: string
- name: Restaurant-Select
sequence:
sequence: string
- name: Taxi-Inform
sequence:
sequence: string
- name: Taxi-Request
sequence:
sequence: string
- name: Train-Inform
sequence:
sequence: string
- name: Train-NoOffer
sequence:
sequence: string
- name: Train-OfferBook
sequence:
sequence: string
- name: Train-OfferBooked
sequence:
sequence: string
- name: Train-Request
sequence:
sequence: string
- name: Train-Select
sequence:
sequence: string
- name: general-bye
sequence:
sequence: string
- name: general-greet
sequence:
sequence: string
- name: general-reqmore
sequence:
sequence: string
- name: general-welcome
sequence:
sequence: string
- name: belief_state
sequence:
sequence: string
- name: prev_belief_state
sequence:
sequence: string
- name: belief_state_delta
sequence:
sequence: string
- name: degenerate_user
dtype: bool
splits:
- name: train
num_bytes: 71669619
num_examples: 56719
- name: validation
num_bytes: 9862893
num_examples: 7374
- name: test
num_bytes: 9864860
num_examples: 7368
download_size: 15883931
dataset_size: 91397372
---
# Dataset Card for "multiwoz_turns_v24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
veerav96/sd302 | ---
license: apache-2.0
---
|
Cafet/main_train | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4244668440
num_examples: 16839
download_size: 4207060600
dataset_size: 4244668440
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibndias/distilabel-capybara-dpo-7k-binarized | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversation
list:
- name: input
dtype: string
- name: output
dtype: string
- name: original_response
dtype: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: new_generations
sequence: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rating_chosen
dtype: int64
- name: rating_rejected
dtype: int64
- name: chosen_model
dtype: string
- name: rejected_model
dtype: string
splits:
- name: train
num_bytes: 348791651
num_examples: 7563
download_size: 155776373
dataset_size: 348791651
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaleemWaheed/twitter_dataset_1713039310 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9896
num_examples: 23
download_size: 8656
dataset_size: 9896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
WKLI22/scanbank_hf_small | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: height
dtype: int64
- name: width
dtype: int64
- name: objects
struct:
- name: area
sequence: int64
- name: bbox
sequence:
sequence: int64
- name: category
sequence: int64
- name: id
sequence: int64
splits:
- name: train
num_bytes: 161371748.81492063
num_examples: 2860
- name: test
num_bytes: 1889607.9215686275
num_examples: 22
download_size: 220120029
dataset_size: 163261356.73648927
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
McSpicyWithMilo/target-locations-0.2split-new-180 | ---
dataset_info:
features:
- name: target_location
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 17066.4
num_examples: 144
- name: test
num_bytes: 4266.6
num_examples: 36
download_size: 14677
dataset_size: 21333.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "target-locations-0.2split-new-180"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
valurank/spam_ham_comments | ---
license: other
license_name: valurank
license_link: LICENSE
---
|