| datasetId (string, length 2–117) | card (string, length 19–1.01M) |
|---|---|
sydthedev/labelled-posts-1k | ---
license: mit
---
|
autoevaluate/autoeval-eval-scientific_papers-pubmed-c3b6df-51381145313 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- scientific_papers
eval_info:
task: summarization
model: Samuel-Fipps/t5-efficient-large-nl36_fine_tune_sum_V2
metrics: ['accuracy', 'frugalscore']
dataset_name: scientific_papers
dataset_config: pubmed
dataset_split: train
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Samuel-Fipps/t5-efficient-large-nl36_fine_tune_sum_V2
* Dataset: scientific_papers
* Config: pubmed
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@NessTechIntl](https://huggingface.co/NessTechIntl) for evaluating this model. |
roleplay4fun/bot_configs | ---
dataset_info:
features:
- name: bot_name
dtype: string
- name: user_name
dtype: string
- name: persona
dtype: string
- name: multi_personas
sequence: 'null'
- name: demos
dtype: string
- name: scenario
dtype: string
- name: first_message
dtype: string
- name: tags
sequence: string
- name: source
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 207285.78
num_examples: 20
download_size: 130949
dataset_size: 207285.78
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sunrise110/loraMode | ---
license: apache-2.0
---
|
qgiaohc/twitter_dataset_1713125113 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 31467
num_examples: 74
download_size: 17287
dataset_size: 31467
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/VQAv2_minival_validation_google_flan_t5_xxl_mode_D_PNP_GENERIC_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 143060
num_examples: 1000
download_size: 53460
dataset_size: 143060
---
# Dataset Card for "VQAv2_minival_validation_google_flan_t5_xxl_mode_D_PNP_GENERIC_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SciPhi/AgentSearch-V1 | ---
language:
- en
size_categories:
- 1B<n<10B
task_categories:
- text-generation
pretty_name: AgentSearch-V1
configs:
- config_name: default
data_files:
- split: train
path: "**/*.parquet"
---
### Getting Started
The AgentSearch-V1 dataset comprises over one billion embeddings, produced using [jina-v2-base](https://huggingface.co/jinaai/jina-embeddings-v2-base-en). It covers more than 50 million high-quality documents and over 1 billion passages, drawn from sources such as arXiv, Wikipedia, and Project Gutenberg, along with carefully filtered Creative Commons (CC) data. Our team is dedicated to continuously expanding and enhancing this corpus to improve the search experience. We welcome your thoughts and suggestions – please feel free to reach out with your ideas!
To access and utilize the AgentSearch-V1 dataset, you can stream it via HuggingFace with the following Python code:
```python
from datasets import load_dataset
import json
import numpy as np

# To stream the entire dataset:
ds = load_dataset("SciPhi/AgentSearch-V1", data_files="**/*", split="train", streaming=True)

# Optional: stream just the "arxiv" subset instead
# ds = load_dataset("SciPhi/AgentSearch-V1", data_files="arxiv/*", split="train", streaming=True)

# To process the entries:
for entry in ds:
    # Embeddings are stored as raw float32 bytes, one 768-dim vector per chunk
    embeddings = np.frombuffer(
        entry['embeddings'], dtype=np.float32
    ).reshape(-1, 768)
    text_chunks = json.loads(entry['text_chunks'])
    metadata = json.loads(entry['metadata'])
    print(f'Embeddings:\n{embeddings}\n\nChunks:\n{text_chunks}\n\nMetadata:\n{metadata}')
    break
```
---
A full set of scripts to recreate the dataset from scratch can be found [here](https://github.com/SciPhi-AI/agent-search). Further, you may check the docs for details on how to perform RAG over AgentSearch.
### Languages
English.
## Dataset Structure
The raw dataset structure is as follows:
```json
{
"url": ...,
"title": ...,
"metadata": {"url": "...", "timestamp": "...", "source": "...", "language": "...", ...},
"text_chunks": ...,
"embeddings": ...,
"dataset": "book" | "arxiv" | "wikipedia" | "stack-exchange" | "open-math" | "RedPajama-Data-V2"
}
```
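In this layout, `embeddings` holds raw float32 bytes while `text_chunks` and `metadata` are JSON-encoded strings, so each entry needs a small decoding step. A minimal sketch using a synthetic entry (the field layout mirrors the schema above; real entries come from the streamed dataset):

```python
import json
import numpy as np

# Synthetic entry mimicking the raw schema above; real entries come from the
# streamed dataset, and this exact layout is an assumption based on the card.
vectors = np.arange(2 * 768, dtype=np.float32).reshape(2, 768)
entry = {
    "url": "https://example.com/doc",
    "title": "Example document",
    "metadata": json.dumps({"url": "https://example.com/doc", "source": "wikipedia"}),
    "text_chunks": json.dumps(["first passage", "second passage"]),
    "embeddings": vectors.tobytes(),
    "dataset": "wikipedia",
}

# Embeddings arrive as raw float32 bytes: one 768-dim vector per text chunk.
embeddings = np.frombuffer(entry["embeddings"], dtype=np.float32).reshape(-1, 768)
text_chunks = json.loads(entry["text_chunks"])
metadata = json.loads(entry["metadata"])

assert embeddings.shape == (len(text_chunks), 768)
```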
## Dataset Creation
This dataset was created as a step towards making humanity's most important knowledge openly searchable and LLM optimal. It was built by filtering, cleaning, and augmenting publicly available datasets.
To cite our work, please use the following:
```
@software{SciPhi2023AgentSearch,
author = {SciPhi},
title = {AgentSearch [ΨΦ]: A Comprehensive Agent-First Framework and Dataset for Webscale Search},
year = {2023},
url = {https://github.com/SciPhi-AI/agent-search}
}
```
### Source Data
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
```
```
@misc{paster2023openwebmath,
title={OpenWebMath: An Open Dataset of High-Quality Mathematical Web Text},
author={Keiran Paster and Marco Dos Santos and Zhangir Azerbayev and Jimmy Ba},
year={2023},
eprint={2310.06786},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
```
@software{together2023redpajama,
author = {Together Computer},
title = {RedPajama: An Open Source Recipe to Reproduce LLaMA training dataset},
month = April,
year = 2023,
url = {https://github.com/togethercomputer/RedPajama-Data}
}
```
### License
Please refer to the licenses of the data subsets you use.
* [Open-Web (Common Crawl Foundation Terms of Use)](https://commoncrawl.org/terms-of-use/full/)
* Books: [the_pile_books3 license](https://huggingface.co/datasets/the_pile_books3#licensing-information) and [pg19 license](https://huggingface.co/datasets/pg19#licensing-information)
* [ArXiv Terms of Use](https://info.arxiv.org/help/api/tou.html)
* [Wikipedia License](https://huggingface.co/datasets/wikipedia#licensing-information)
* [StackExchange license on the Internet Archive](https://archive.org/details/stackexchange)
<!--
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
--> |
Jongmin123/aicontest | ---
license: unknown
---
|
mole-code/com.theokanning.openai | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 2736788
num_examples: 438
- name: test
num_bytes: 150175
num_examples: 25
download_size: 932776
dataset_size: 2886963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Nexdata/8178_Chinese_Social_Comments_Events_Annotation_Data | ---
license: cc-by-nc-nd-4.0
language:
- zh
---
## Description
8,178 annotated Chinese social comments. The content consists of trending news from 2013; each piece of news contains one or more events, annotated with time, theme, cause, procedure, and result. The data is stored in XML and can be used for natural language understanding.
For more details, please refer to the link: https://www.nexdata.ai/dataset/83?source=Huggingface
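Because each record is XML with event elements carrying time, theme, cause, procedure, and result fields, parsing needs only the standard library. A minimal sketch on a hypothetical record (the actual Nexdata element names may differ):

```python
import xml.etree.ElementTree as ET

# Hypothetical record; the real Nexdata schema may use different element
# names, so this only illustrates the annotation shape described above.
sample = """
<news>
  <event>
    <time>2013-05-01</time>
    <theme>example theme</theme>
    <cause>example cause</cause>
    <procedure>example procedure</procedure>
    <result>example result</result>
  </event>
</news>
"""

root = ET.fromstring(sample)
# Collect each event's child elements into a tag -> text mapping.
events = [{child.tag: child.text for child in event} for event in root.iter("event")]
print(events[0])
```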
# Specifications
## Data content
Chinese social comments Events Annotation Data
## Data size
8,178 pieces
## Annotation content
Element words of events
## Collecting period
May 2013
## Storage format
XML
## Language
Chinese
## Data category
Event extraction
# Licensing Information
Commercial License |
autoevaluate/autoeval-eval-lener_br-lener_br-14b0f6-1886164287 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/xlm-roberta-base-finetuned-lener_br-finetuned-lener-br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: train
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/xlm-roberta-base-finetuned-lener_br-finetuned-lener-br
* Dataset: lener_br
* Config: lener_br
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
liuyanchen1015/MULTI_VALUE_mrpc_bare_past_tense | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 327219
num_examples: 1210
- name: train
num_bytes: 694247
num_examples: 2565
- name: validation
num_bytes: 77173
num_examples: 280
download_size: 732668
dataset_size: 1098639
---
# Dataset Card for "MULTI_VALUE_mrpc_bare_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cherishh/asr-slu | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: speech
sequence: float64
- name: sampling_rate
dtype: int64
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 3131199570
num_examples: 6002
- name: val
num_bytes: 351773643
num_examples: 667
- name: test
num_bytes: 380367632
num_examples: 741
download_size: 916274597
dataset_size: 3863340845
---
# Dataset Card for "asr-slu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nzindoc/dataset-multiple-myeloma | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 903374
num_examples: 1012
download_size: 75259
dataset_size: 903374
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HPGomes/ModelodevozDinho | ---
license: openrail
---
|
CVasNLPExperiments/docvqa_test_google_flan_t5_xxl_mode_OCR_VQA_Q_rices_ns_5188 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 413408
num_examples: 5188
download_size: 219038
dataset_size: 413408
configs:
- config_name: default
data_files:
- split: fewshot_0
path: data/fewshot_0-*
---
|
BirdL/DONOTUSEDATA-SideA | ---
dataset_info:
features:
- name: text
dtype: string
- name: sexual
dtype: float64
- name: hate
dtype: float64
- name: violence
dtype: float64
- name: self-harm
dtype: float64
- name: sexual/minors
dtype: float64
- name: hate/threatening
dtype: float64
- name: violence/graphic
dtype: float64
splits:
- name: train
num_bytes: 8256999
num_examples: 30002
download_size: 6382984
dataset_size: 8256999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- not-for-all-audiences
---
# Dataset Card for "DONOTUSEDATA"
Studying the effects of harmful data on LLMs. Side A.
Filtered subset of [kjj0/4chanpol-openaimod](https://huggingface.co/datasets/kjj0/4chanpol-openaimod)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cookinai__Valkyrie-V1 | ---
pretty_name: Evaluation run of cookinai/Valkyrie-V1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cookinai/Valkyrie-V1](https://huggingface.co/cookinai/Valkyrie-V1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cookinai__Valkyrie-V1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T01:47:44.529277](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__Valkyrie-V1/blob/main/results_2023-12-30T01-47-44.529277.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522868637521901,\n\
\ \"acc_stderr\": 0.032133209567569515,\n \"acc_norm\": 0.6522561915341794,\n\
\ \"acc_norm_stderr\": 0.03280005450279144,\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.603958710710944,\n\
\ \"mc2_stderr\": 0.01501017049153533\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916573,\n\
\ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6709818761202948,\n\
\ \"acc_stderr\": 0.004688963175758131,\n \"acc_norm\": 0.8626767576180043,\n\
\ \"acc_norm_stderr\": 0.003434848525388187\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n \
\ \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323792,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323792\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083133,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.603958710710944,\n\
\ \"mc2_stderr\": 0.01501017049153533\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337692\n }\n}\n```"
repo_url: https://huggingface.co/cookinai/Valkyrie-V1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|arc:challenge|25_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|gsm8k|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hellaswag|10_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T01-47-44.529277.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- '**/details_harness|winogrande|5_2023-12-30T01-47-44.529277.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T01-47-44.529277.parquet'
- config_name: results
data_files:
- split: 2023_12_30T01_47_44.529277
path:
- results_2023-12-30T01-47-44.529277.parquet
- split: latest
path:
- results_2023-12-30T01-47-44.529277.parquet
---
# Dataset Card for Evaluation run of cookinai/Valkyrie-V1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cookinai/Valkyrie-V1](https://huggingface.co/cookinai/Valkyrie-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cookinai__Valkyrie-V1",
"harness_winogrande_5",
split="train")
```
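As the YAML above shows, each configuration name is derived mechanically from the harness task name: hyphens and colons become underscores, a `harness_` prefix is added, and the few-shot count is appended. A small helper (an illustrative sketch, not part of the leaderboard tooling) can build the config name for any task:

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name such as 'hendrycksTest-anatomy' to the
    dataset config name used in this repo, e.g.
    'harness_hendrycksTest_anatomy_5'."""
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"

# Examples matching the configs listed above:
# config_name("hendrycksTest-anatomy", 5) -> "harness_hendrycksTest_anatomy_5"
# config_name("truthfulqa:mc", 0)         -> "harness_truthfulqa_mc_0"
```

This can be handy when iterating over all 57 MMLU subtasks without typing out each config name by hand.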
## Latest results
These are the [latest results from run 2023-12-30T01:47:44.529277](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__Valkyrie-V1/blob/main/results_2023-12-30T01-47-44.529277.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6522868637521901,
"acc_stderr": 0.032133209567569515,
"acc_norm": 0.6522561915341794,
"acc_norm_stderr": 0.03280005450279144,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.603958710710944,
"mc2_stderr": 0.01501017049153533
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6709818761202948,
"acc_stderr": 0.004688963175758131,
"acc_norm": 0.8626767576180043,
"acc_norm_stderr": 0.003434848525388187
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323792,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323792
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083133,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.603958710710944,
"mc2_stderr": 0.01501017049153533
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337692
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Astr0nautico/joaogomes | ---
license: openrail
---
|
akoukas/autextification2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': generated
'1': human
splits:
- name: train
num_bytes: 8606540.8
num_examples: 27076
- name: test
num_bytes: 1075976.533018171
num_examples: 3385
- name: validation
num_bytes: 1075658.6669818289
num_examples: 3384
download_size: 6332520
dataset_size: 10758176.000000002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
skrishna/coin_flip_2_transformed | ---
dataset_info:
features:
- name: targets
dtype: string
- name: targets_vec
sequence: int64
- name: inputs
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 519914
num_examples: 2000
- name: test
num_bytes: 521668
num_examples: 2000
download_size: 206066
dataset_size: 1041582
---
# Dataset Card for "coin_flip_2_transformed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tonytan48/Re-DocRED | ---
license: mit
---
# Re-DocRED Dataset
This repository contains the dataset of our EMNLP 2022 research paper [Revisiting DocRED – Addressing the False Negative Problem
in Relation Extraction](https://arxiv.org/pdf/2205.12696.pdf).
DocRED is a widely used benchmark for document-level relation extraction. However, the DocRED dataset contains a significant percentage of false negative examples (incomplete annotation). We revised 4,053 documents in the DocRED dataset to resolve these problems, and released the result as the Re-DocRED dataset.
The Re-DocRED Dataset resolved the following problems of DocRED:
1. Resolved the incompleteness problem by supplementing large amounts of relation triples.
2. Addressed the logical inconsistencies in DocRED.
3. Corrected the coreferential errors within DocRED.
# Statistics of Re-DocRED
The Re-DocRED dataset is located in the ./data directory; its statistics are shown below:
| | Train | Dev | Test |
| :---: | :-: | :-: |:-: |
| # Documents | 3,053 | 500 | 500 |
| Avg. # Triples | 28.1 | 34.6 | 34.9 |
| Avg. # Entities | 19.4 | 19.4 | 19.6 |
| Avg. # Sents | 7.9 | 8.2 | 7.9 |
# Citation
If you find our work useful, please cite our work as:
```bibtex
@inproceedings{tan2022revisiting,
title={Revisiting DocRED – Addressing the False Negative Problem in Relation Extraction},
author={Tan, Qingyu and Xu, Lu and Bing, Lidong and Ng, Hwee Tou and Aljunied, Sharifah Mahani},
booktitle={Proceedings of EMNLP},
url={https://arxiv.org/abs/2205.12696},
year={2022}
}
```
|
malhajar/distilabel-intel-orca-dpo-pairs-tr | ---
language:
- tr
license: mit
size_categories:
- 10M<n<100M
task_categories:
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
pretty_name: OrcaDPO
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: generations
sequence: string
- name: order
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: rationale
dtype: string
- name: status
dtype: string
- name: original_chosen
dtype: string
- name: original_rejected
dtype: string
- name: chosen_score
dtype: float64
- name: in_gsm8k_train
dtype: bool
splits:
- name: train
num_bytes: 97012722.42875265
num_examples: 9120
download_size: 43511007
dataset_size: 97012722.42875265
---
# Dataset Card for "malhajar/orca_dpo_pairs-tr"
This dataset is part of a series of datasets aimed at advancing Turkish LLM development by establishing a rigorous collection of Turkish datasets to enhance the performance of LLMs produced for the Turkish language.
malhajar/orca_dpo_pairs-tr is a translated version of [`argilla/distilabel-intel-orca-dpo-pairs`]( https://huggingface.co/datasets/argilla/distilabel-intel-orca-dpo-pairs)
**Translated by:** [`Mohamad Alhajar`](https://www.linkedin.com/in/muhammet-alhajar/)
### Dataset Summary
This is a pre-processed version of the [OpenOrca dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca) translated to Turkish.
The original OpenOrca dataset is a collection of augmented FLAN data that aligns, as best as possible, with the distributions outlined in the [Orca paper](https://arxiv.org/abs/2306.02707).
It has been instrumental in generating high-performing preference-tuned model checkpoints and serves as a valuable resource for all NLP researchers and developers!
# Dataset Summary
The OrcaDPO Pair dataset is a subset of the OpenOrca dataset suitable for DPO preference tuning. The dataset is stored in parquet format with each entry using the following schema:
```
{
'prompt': 'Read the following paragraph and determine if the hypothesis is true:\n\nWorld leaders expressed concern on Thursday that North Ko...'
'chosen': [
{'content': 'You are a helpful assistant, who always provide explanation. Think like you are answering to a five year old.',
'role': 'system'
},
{'content': 'Read the following paragraph and determine if the hypothesis is true...',
'role': 'user'
},
{'content': 'Okay little buddy, let\'s look at this...',
'role': 'assistant'
}
],
'rejected': [
{'content': 'You are a helpful assistant, who always provide explanation. Think like you are answering to a five year old.',
'role': 'system'
},
{'content': 'Read the following paragraph and determine if the hypothesis is true...',
'role': 'user'
},
{'content': 'Oh my gosh! Let me see if I can help you with that! ...',
'role': 'assistant'
}
],
}
```
### Data Splits
The dataset consists of two splits, `"train_prefs"` and `"test_prefs"`:
| train_prefs | test_prefs |
|:-------:|:-----------:|
| 12359 | 500 |
### Usage
To load the dataset, run:
```python
from datasets import load_dataset
ds = load_dataset("malhajar/distilabel-intel-orca-dpo-pairs-tr")
```
<a name="languages"></a>
# Languages
The language of the data is primarily Turkish.
<a name="dataset-structure"></a>
# Citation
```bibtex
@misc{OpenOrca,
title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
```
|
mylesmharrison/cornell-movie-dialog | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 21363514
num_examples: 304713
download_size: 13073496
dataset_size: 21363514
---
# Dataset Card for "cornell-movie-dialog"
This is a reduced version of the [Cornell Movie Dialog Corpus](https://www.cs.cornell.edu/~cristian/Cornell_Movie-Dialogs_Corpus.html) by Cristian Danescu-Niculescu-Mizil.
The original dataset contains 220,579 conversational exchanges between 10,292 pairs of movie characters, involving 9,035 characters from 617 movies, for a total of 304,713 utterances.
This reduced version of the dataset contains only the character tags and utterances from the `movie_lines.txt` file, with one utterance per line, suitable for training generative text models.
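Since each row exposes a single `text` feature (per the feature list above), one way to prepare the utterances for training is to strip and drop empty lines. This is an illustrative sketch only: the helper name and the sample rows below are hypothetical stand-ins, not actual dataset entries.

```python
def to_training_lines(rows):
    """Yield stripped, non-empty utterance lines from dataset rows.

    Each row is expected to expose a single 'text' field, as in the
    feature list above.
    """
    for row in rows:
        line = row["text"].strip()
        if line:
            yield line


# Hypothetical sample rows standing in for real dataset entries.
sample_rows = [
    {"text": "BIANCA: They do not!\n"},
    {"text": "   "},
    {"text": "CAMERON: They do to!"},
]
print(list(to_training_lines(sample_rows)))
```

The same generator can be fed directly from `load_dataset("mylesmharrison/cornell-movie-dialog")["train"]`.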
## Dataset Description
- **Homepage:** https://www.cs.cornell.edu/~cristian/Cornell_Movie-Dialogs_Corpus.html
- **Repository:** https://convokit.cornell.edu/documentation/movie.html
- **Paper:** [Chameleons in imagined conversations: A new approach to understanding
coordination of linguistic style in dialogs](https://www.cs.cornell.edu/~cristian/papers/chameleons.pdf)
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
|
timescale/wikipedia-22-12-simple-embeddings | ---
configs:
- config_name: default
data_files:
- split: train
path: wiki.csv
license: apache-2.0
task_categories:
- text-retrieval
language:
- en
---
# wikipedia-22-12-simple-embeddings
A modified version of [Cohere/wikipedia-22-12-simple-embeddings](https://huggingface.co/datasets/Cohere/wikipedia-22-12-simple-embeddings)
meant for use with PostgreSQL with pgvector and Timescale Vector.
## Dataset Details
This dataset was created for exploring time-based filtering and semantic search in PostgreSQL with pgvector and Timescale Vector.
This is a modified version of the [Cohere wikipedia-22-12-simple-embeddings dataset hosted on Huggingface](https://huggingface.co/datasets/Cohere/wikipedia-22-12-simple-embeddings).
It contains embeddings of [Simple English Wikipedia](https://simple.wikipedia.org/) entries.
We added synthetic data: a time column, category, and tags.
We loaded the data into a postgres table and exported it to a CSV file; therefore, the format has changed.
The original dataset is available under the Apache 2.0 license, and thus, our modified version is also subject to the Apache 2.0 license.
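As a pure-Python illustration of the semantic-search idea this dataset supports: nearest-neighbor retrieval ranks documents by cosine similarity between embedding vectors. The tiny 3-dimensional vectors below are toy stand-ins for the real Cohere embeddings, and in practice pgvector computes this distance inside PostgreSQL.

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy low-dimensional stand-ins for the real embedding vectors.
query = [1.0, 0.1, 0.0]
docs = {"doc_a": [0.9, 0.1, 0.0], "doc_b": [0.0, 1.0, 0.2]}

# Rank documents by similarity to the query and pick the best match.
best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print(best)
```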
|
erhwenkuo/medical_dialogue-chinese-zhtw | ---
language:
- zh
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-generation
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 553726613
num_examples: 799743
download_size: 288686981
dataset_size: 553726613
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "medical_dialogue-chinese-zhtw"
A Chinese (Traditional) medical question-and-answer dataset.
## Source
This dataset was converted from the [Toyhom/Chinese-medical-dialogue-data](https://github.com/Toyhom/Chinese-medical-dialogue-data) GitHub repo.
## Contents
|Department|Size|
|---|---|
**Andriatria (Andrology)** |94,596 QA pairs|
**IM (Internal Medicine)** |220,606 QA pairs|
**OAGD (Obstetrics and Gynecology)** |183,751 QA pairs|
**Oncology** |75,553 QA pairs|
**Pediatric (Pediatrics)** |101,602 QA pairs|
**Surgical (Surgery)** |115,991 QA pairs|
792,099 entries in total
## Example
```json
{
"instruction": "現在你是個神經腦外科醫生,請根據病人的問題給予建議:",
"input": "癲癇病能吃德巴金嗎,錯覺,有時候感覺看到的和聽到的不太一樣。",
"output": "巴金是廣譜抗病藥物,主要作用於中樞神經系統,對動物的藥理研究發現德巴金對各種癲癇的實驗模型(全身性和局部性)均有抗驚厥作用,對人的各種類型癲癇發作有抑製作用,作用機理可能與增加γ-氨基丁酸的濃度有關。主要是治癲癇藥物。建議在醫生的知道下,用藥,祝您身體早日康復。"
}
```
## Fields
```
instruction: the instruction (prompt)
input: the input (patient question)
output: the output (answer)
```
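As a sketch, the three fields can be joined into a single training string. The field names are taken from the example above; the newline-joined prompt template itself is an illustrative assumption, not part of the dataset, and the sample answer below is a placeholder.

```python
def format_record(record):
    """Join instruction, input, and output into one training string.

    The newline-joined template is an illustrative choice, not part of
    the dataset itself.
    """
    return "\n".join([record["instruction"], record["input"], record["output"]])


# Sample record mirroring the example above; the output is a placeholder.
sample = {
    "instruction": "現在你是個神經腦外科醫生,請根據病人的問題給予建議:",
    "input": "癲癇病能吃德巴金嗎,錯覺,有時候感覺看到的和聽到的不太一樣。",
    "output": "(answer text)",
}
print(format_record(sample))
```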
## Usage Restrictions
This dataset is intended for research on large language models and must not be used for purposes that could harm society.
This dataset does not represent the position, interests, or views of any party, and makes no claims of any kind on behalf of any group. This project assumes no responsibility for any damages or disputes arising from the use of this dataset. |
causalnlp/corr2cause | ---
configs:
- config_name: default
data_files:
- split: train
path: train.csv
- split: test
path: test.csv
- split: validation
path: dev.csv
- config_name: perturbation_by_paraphrasing
data_files:
- split: train
path: perturbation_by_paraphrasing_train.csv
- split: test
path: perturbation_by_paraphrasing_test.csv
- split: validation
path: perturbation_by_paraphrasing_dev.csv
- config_name: perturbation_by_refactorization
data_files:
- split: train
path: perturbation_by_refactorization_train.csv
- split: test
path: perturbation_by_refactorization_test.csv
- split: validation
path: perturbation_by_refactorization_dev.csv
---
# Dataset card for corr2cause
TODO |
A-Bar/de-nl_non_top_cs_dev | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 42018941
num_examples: 100000
download_size: 17480084
dataset_size: 42018941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cmeraki/wiki_en_hi | ---
license: apache-2.0
---
A subset of wikitext combined with a sample of Hindi Wikipedia articles. Each line contains a paragraph from the article. |
open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-7b | ---
pretty_name: Evaluation run of seyf1elislam/WestKunai-Hermes-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [seyf1elislam/WestKunai-Hermes-7b](https://huggingface.co/seyf1elislam/WestKunai-Hermes-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-16T07:13:17.960407](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-7b/blob/main/results_2024-03-16T07-13-17.960407.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6528732572325661,\n\
\ \"acc_stderr\": 0.03200409908508436,\n \"acc_norm\": 0.6526927558383714,\n\
\ \"acc_norm_stderr\": 0.03266609121711541,\n \"mc1\": 0.49938800489596086,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6525182868612888,\n\
\ \"mc2_stderr\": 0.01524337736119199\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.01362169611917331,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428176\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7037442740489942,\n\
\ \"acc_stderr\": 0.004556719864763071,\n \"acc_norm\": 0.8776140211113324,\n\
\ \"acc_norm_stderr\": 0.003270612753613403\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493875,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.016242028834053616,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.016242028834053616\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49938800489596086,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6525182868612888,\n\
\ \"mc2_stderr\": 0.01524337736119199\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \
\ \"acc_stderr\": 0.01273171092507814\n }\n}\n```"
repo_url: https://huggingface.co/seyf1elislam/WestKunai-Hermes-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|arc:challenge|25_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|gsm8k|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hellaswag|10_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T07-13-17.960407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T07-13-17.960407.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- '**/details_harness|winogrande|5_2024-03-16T07-13-17.960407.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-16T07-13-17.960407.parquet'
- config_name: results
data_files:
- split: 2024_03_16T07_13_17.960407
path:
- results_2024-03-16T07-13-17.960407.parquet
- split: latest
path:
- results_2024-03-16T07-13-17.960407.parquet
---
# Dataset Card for Evaluation run of seyf1elislam/WestKunai-Hermes-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [seyf1elislam/WestKunai-Hermes-7b](https://huggingface.co/seyf1elislam/WestKunai-Hermes-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-16T07:13:17.960407](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-7b/blob/main/results_2024-03-16T07-13-17.960407.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6528732572325661,
"acc_stderr": 0.03200409908508436,
"acc_norm": 0.6526927558383714,
"acc_norm_stderr": 0.03266609121711541,
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6525182868612888,
"mc2_stderr": 0.01524337736119199
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.01362169611917331,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428176
},
"harness|hellaswag|10": {
"acc": 0.7037442740489942,
"acc_stderr": 0.004556719864763071,
"acc_norm": 0.8776140211113324,
"acc_norm_stderr": 0.003270612753613403
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493875,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846177,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053616,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6525182868612888,
"mc2_stderr": 0.01524337736119199
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363705
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.01273171092507814
}
}
```
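As a sketch of how these per-task metrics can be post-processed locally, the snippet below averages the `acc` metric over a few harness tasks. The task names and values are copied from the excerpt above; a real analysis would iterate over every task in the downloaded results file rather than this hand-picked subset:

```python
import statistics

# Minimal sketch: average the "acc" metric over a few harness tasks,
# using a dict shaped like the results excerpt above.
results = {
    "harness|arc:challenge|25": {"acc": 0.6809, "acc_norm": 0.7116},
    "harness|hellaswag|10": {"acc": 0.7037, "acc_norm": 0.8776},
    "harness|winogrande|5": {"acc": 0.8303},
}

accs = [metrics["acc"] for metrics in results.values()]
print(round(statistics.mean(accs), 4))  # mean accuracy over the selected tasks
```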
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DBQ/Hermes.Product.prices.Italy | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Italy - Hermes - Product-level price list
tags:
- webscraping
- ecommerce
- Hermes
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: int64
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 196574
num_examples: 535
download_size: 49870
dataset_size: 196574
---
# Hermes web scraped data
## About the website
The **EMEA luxury fashion industry**, particularly in **Italy**, is characterized by its high-end, high-quality retail products. Italy is home to many iconic fashion brands and is one of the world's fashion capitals. **Hermes** is one of the prominent players in this sphere. With its sophisticated designs and high-quality craftsmanship, Hermes caters to a luxury-oriented demographic. The retail industry, more specifically, is making strategic shifts towards **Ecommerce**, stepping up its game in the digital market. The dataset observed provides insight into this shift, featuring **Ecommerce product-list page (PLP)** data of Hermes in Italy. This data offers insight into Hermes's online market dynamics, signalling its efforts to adapt to rapidly evolving digital shopping trends.
## Link to **dataset**
[Italy - Hermes - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Hermes%20Product-prices%20Italy/r/recHVPAiKIshqPYKD)
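As an illustrative sketch of working with this card's price schema (`full_price`, `price`, `flg_discount`), the snippet below recomputes the discount flag and a discount percentage from the two price fields. The product codes and prices are made up for illustration and are not taken from the dataset:

```python
# Sketch: derive the discount flag and discount depth from the price
# fields in this card's schema. Rows here are invented examples.
rows = [
    {"product_code": "H001", "full_price": 450.0, "price": 450.0},
    {"product_code": "H002", "full_price": 980.0, "price": 784.0},
]

for row in rows:
    # flg_discount is 1 when the current price is below the full price.
    row["flg_discount"] = int(row["price"] < row["full_price"])
    row["discount_pct"] = round(100 * (1 - row["price"] / row["full_price"]), 1)

print([(r["product_code"], r["flg_discount"], r["discount_pct"]) for r in rows])
```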
|
bigscience-data/roots_en_odiencorp | ---
language: en
license: cc-by-nc-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_en_odiencorp
# OdiEnCorp2.0
- Dataset uid: `odiencorp`
### Description
OdiEnCorp is a collection of Odia-English parallel and Odia monolingual sentences collected from different sources such as Odia Wikipedia, websites, books, and dictionaries using different manual and machine learning techniques, including web scraping and optical character recognition. OdiEnCorp 2.0 served as data for the WAT 2020 English-Odia Indic Task.
### Homepage
https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-3211
### Licensing
- non-commercial use
- cc-by-nc-sa-4.0: Creative Commons Attribution Non Commercial Share Alike 4.0 International
### Speaker Locations
- Southern Asia
- India
### Sizes
- 0.0043 % of total
- 2.2553 % of indic-or
- 0.0000 % of en
### BigScience processing steps
#### Filters applied to: indic-or
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: en
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
|
mayflowergmbh/booksum_de | ---
task_categories:
- text-generation
language:
- de
---
A German translation of the [booksum](https://huggingface.co/datasets/togethercomputer/Long-Data-Collections) dataset.
Extracted from [seedboxventures/multitask_german_examples_32k](https://huggingface.co/datasets/seedboxventures/multitask_german_examples_32k).
Translation created by [seedbox ai](https://huggingface.co/seedboxai) for [KafkaLM](https://huggingface.co/seedboxai/KafkaLM-70B-German-V0.1) ❤️.
Available for finetuning in [hiyouga/LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory). |
arbml/L_HSAB | ---
dataset_info:
features:
- name: Tweet
dtype: string
- name: label
dtype:
class_label:
names:
0: null
1: abusive
2: hate
3: normal
splits:
- name: train
num_bytes: 1352345
num_examples: 5846
download_size: 566158
dataset_size: 1352345
---
# Dataset Card for "L_HSAB"
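As a small illustrative sketch (not part of the original card), the integer labels declared in the `class_label` metadata above can be mapped back to class names as follows; index 0 has no name in the metadata and is treated as unknown here:

```python
# Map integer class ids to the names listed in this card's class_label
# metadata; id 0 is unnamed in the metadata and is kept as None.
label_names = {0: None, 1: "abusive", 2: "hate", 3: "normal"}

predictions = [3, 1, 2, 3]  # made-up model outputs for illustration
print([label_names[p] for p in predictions])
```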
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Junghans/techno_album | ---
license: openrail
---
|
Softage-AI/rlhf-qa_dataset | ---
license: mit
---
# RLHF Q&A Dataset With Citations
## Description
This dataset provides 133 prompts of various types, including coding, math, general knowledge, personal queries, and writing tasks. Each prompt is paired with an answer generated by an LLM and a human-curated rewrite, along with ratings for fluency and perceived utility. This dataset, though limited in its size and scope, serves as an illustration of SoftAge's capabilities in the domain of RLHF for training AI language agents.
## Data attributes
- Query: Original prompt or question
- Answers: Responses generated by a large language model.
- Writer's Answer: The LLM's answer as rewritten by a domain-expert human writer.
- Fluency Rating (1-7): Human rating of the answer's natural language flow
- Perceived Utility Rating (1-7): Human rating of the answer's helpfulness and relevance
- Links: Up to 7 links potentially relevant to the answer to the query.
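As an illustrative sketch of working with these attributes, the snippet below filters examples by the 1-7 ratings described above. The field names and values are assumptions based on the attribute list, not the dataset's actual column names:

```python
# Hypothetical rows mirroring the attributes above; real column names
# may differ, so treat these keys as placeholders.
examples = [
    {"query": "Explain recursion", "fluency": 6, "utility": 7},
    {"query": "Write a haiku", "fluency": 3, "utility": 4},
    {"query": "Sum 2 + 2", "fluency": 7, "utility": 5},
]

# Keep only answers rated at least 5 on both axes.
high_quality = [e for e in examples if e["fluency"] >= 5 and e["utility"] >= 5]
print(len(high_quality))
```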
## Limitations and Biases
- The dataset size might not comprehensively represent the full range of complexities within each query type.
- Human biases might influence the quality and ratings of the provided answers.
- The choice of the reference links might reflect the writer’s prior knowledge or search strategies.
## Potential Uses
- Training RLHF models to generate fluent and informative rewrites of multiple prompts.
- Evaluating the quality and effectiveness of RLHF models in different domains.
- Analyzing human preferences and biases in response generation and rating.
- Developing potential new metrics and evaluation methods for RLHF tasks. |
appvoid/simple-prompt-oasst | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22163146
num_examples: 12947
download_size: 13029273
dataset_size: 22163146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bimbom1310/NSMTrainData | ---
license: openrail
task_categories:
- token-classification
language:
- en
tags:
- code
pretty_name: NSM_Train_Data
size_categories:
- 1K<n<10K
--- |
bigscience-data/roots_fr_uncorpus | ---
language: fr
license: cc-by-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_fr_uncorpus
# uncorpus
- Dataset uid: `uncorpus`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 2.8023 % of total
- 10.7390 % of ar
- 5.7970 % of fr
- 9.7477 % of es
- 2.0417 % of en
- 1.2540 % of zh
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
|
kaleemWaheed/twitter_dataset_1713009750 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10567
num_examples: 24
download_size: 9584
dataset_size: 10567
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_breadlicker45__dough-base-001 | ---
pretty_name: Evaluation run of breadlicker45/dough-base-001
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [breadlicker45/dough-base-001](https://huggingface.co/breadlicker45/dough-base-001)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_breadlicker45__dough-base-001\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T02:32:49.723963](https://huggingface.co/datasets/open-llm-leaderboard/details_breadlicker45__dough-base-001/blob/main/results_2023-10-29T02-32-49.723963.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0029163171140939564,\n \"f1_stderr\"\
: 0.00019355490209304062,\n \"acc\": 0.255327545382794,\n \"acc_stderr\"\
: 0.007024647268145198\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 0.0029163171140939564,\n \"\
f1_stderr\": 0.00019355490209304062\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.014049294536290396\n\
\ }\n}\n```"
repo_url: https://huggingface.co/breadlicker45/dough-base-001
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T02_00_15.543056
path:
- '**/details_harness|drop|3_2023-10-29T02-00-15.543056.parquet'
- split: 2023_10_29T02_32_49.723963
path:
- '**/details_harness|drop|3_2023-10-29T02-32-49.723963.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T02-32-49.723963.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T02_00_15.543056
path:
- '**/details_harness|gsm8k|5_2023-10-29T02-00-15.543056.parquet'
- split: 2023_10_29T02_32_49.723963
path:
- '**/details_harness|gsm8k|5_2023-10-29T02-32-49.723963.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T02-32-49.723963.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-12-28.280269.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-12-28.280269.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T02_00_15.543056
path:
- '**/details_harness|winogrande|5_2023-10-29T02-00-15.543056.parquet'
- split: 2023_10_29T02_32_49.723963
path:
- '**/details_harness|winogrande|5_2023-10-29T02-32-49.723963.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T02-32-49.723963.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_12_28.280269
path:
- results_2023-10-03T17-12-28.280269.parquet
- split: 2023_10_29T02_00_15.543056
path:
- results_2023-10-29T02-00-15.543056.parquet
- split: 2023_10_29T02_32_49.723963
path:
- results_2023-10-29T02-32-49.723963.parquet
- split: latest
path:
- results_2023-10-29T02-32-49.723963.parquet
---
# Dataset Card for Evaluation run of breadlicker45/dough-base-001
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/breadlicker45/dough-base-001
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [breadlicker45/dough-base-001](https://huggingface.co/breadlicker45/dough-base-001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_breadlicker45__dough-base-001",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-29T02:32:49.723963](https://huggingface.co/datasets/open-llm-leaderboard/details_breadlicker45__dough-base-001/blob/main/results_2023-10-29T02-32-49.723963.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```json
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0029163171140939564,
"f1_stderr": 0.00019355490209304062,
"acc": 0.255327545382794,
"acc_stderr": 0.007024647268145198
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0029163171140939564,
"f1_stderr": 0.00019355490209304062
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.510655090765588,
"acc_stderr": 0.014049294536290396
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hanifsyarubany10/JakartaSearch-IndoQA-gemma | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 10670699
num_examples: 4333
download_size: 4886008
dataset_size: 10670699
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jonathan-roberts1/Brazilian_Cerrado-Savanna_Scenes | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': agriculture
'1': arboreal vegetation
'2': herbaceous vegetation
'3': shrubby vegetation
splits:
- name: train
num_bytes: 16933385.557
num_examples: 1311
download_size: 14574976
dataset_size: 16933385.557
license: other
task_categories:
- zero-shot-image-classification
- image-classification
---
# Dataset Card for "Brazilian_Cerrado-Savanna_Scenes"
## Dataset Description
- **Paper** [Towards vegetation species discrimination by using data-driven descriptors](https://vision.unipv.it/CV/materiale2016-17/3rd%20Choice/0022.pdf)
-
### Licensing Information
[CC BY-NC]
## Citation Information
[Towards vegetation species discrimination by using data-driven descriptors](https://vision.unipv.it/CV/materiale2016-17/3rd%20Choice/0022.pdf)
```
@inproceedings{nogueira2016towards,
title = {Towards vegetation species discrimination by using data-driven descriptors},
author = {Nogueira, Keiller and Dos Santos, Jefersson A and Fornazari, Tamires and Silva, Thiago Sanna Freire and Morellato, Leonor Patricia and Torres, Ricardo da S},
year = 2016,
booktitle = {2016 9th IAPR Workshop on Pattern Recogniton in Remote Sensing (PRRS)},
pages = {1--6},
organization = {Ieee}
}
``` |
Shuv001/Sengine | ---
license: apache-2.0
---
|
KaiNylund/arxiv-year-splits | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: 2006_2008_train
num_bytes: 100484371
num_examples: 120937
- name: 2006_2008_test
num_bytes: 10050474
num_examples: 12157
- name: 2009_2011_train
num_bytes: 145839572
num_examples: 157401
- name: 2009_2011_test
num_bytes: 15067693
num_examples: 16306
- name: 2012_2014_train
num_bytes: 149239610
num_examples: 153162
- name: 2012_2014_test
num_bytes: 15064105
num_examples: 15440
- name: 2015_2017_train
num_bytes: 150547411
num_examples: 136762
- name: 2015_2017_test
num_bytes: 15057851
num_examples: 13745
- name: 2018_2020_train
num_bytes: 150517629
num_examples: 129279
- name: 2018_2020_test
num_bytes: 15052957
num_examples: 12885
download_size: 474674602
dataset_size: 766921673
---
# Dataset Card for "arxiv-year-splits"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rubensmau/DovTzamir-fragmentos-memoria | ---
license: mit
conteudo: Chapter of the book Fragmentos de Memória by Abraham Milgrom, written by Dov Tzamir
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_250 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 883011920.0
num_examples: 172060
download_size: 904700995
dataset_size: 883011920.0
---
# Dataset Card for "chunk_250"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmayhem93/agieval-gaokao-history | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 120008
num_examples: 235
download_size: 78981
dataset_size: 120008
license: mit
---
# Dataset Card for "agieval-gaokao-history"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo.
MIT License
Copyright (c) Microsoft Corporation.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
} |
JAYASWAROOP/trail1 | ---
task_categories:
- question-answering
--- |
hellokitty/accident | ---
license: apache-2.0
---
|
yzhuang/autotree_automl_10000_electricity_sgosdt_l256_dim7_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 205720000
num_examples: 10000
- name: validation
num_bytes: 205720000
num_examples: 10000
download_size: 102866704
dataset_size: 411440000
---
# Dataset Card for "autotree_automl_10000_electricity_sgosdt_l256_dim7_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
firopyomyo/ggggggg | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': conditioning_images
'1': images
splits:
- name: train
num_bytes: 9235.0
num_examples: 2
download_size: 6697
dataset_size: 9235.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
reach-vb/mls-eng-10k-repunct-all | ---
dataset_info:
features:
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: transcript
dtype: string
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
- name: repunct_text
dtype: string
splits:
- name: dev
num_bytes: 2182587
num_examples: 3807
- name: test
num_bytes: 2168630
num_examples: 3769
download_size: 2442974
dataset_size: 4351217
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
Yubing/dogs | ---
license: openrail
---
|
SeyedAli/Persian-Text-Emotion | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1612793
num_examples: 5558
- name: test
num_bytes: 409414
num_examples: 1390
download_size: 1143196
dataset_size: 2022207
task_categories:
- text-classification
language:
- fa
---
Dataset Classes
* joy:0
* sad:1
* anger:2
* disgust:3
* fear:4
* surprise:5 |
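The integer labels above can be mapped back to emotion names with a small helper. This is a minimal sketch that simply restates the class list from the card; the mapping dictionaries are illustrative, not part of the dataset's files:

```python
# Mapping of integer labels to emotion names, as listed in the card above.
ID2LABEL = {0: "joy", 1: "sad", 2: "anger", 3: "disgust", 4: "fear", 5: "surprise"}
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}

def label_name(label_id: int) -> str:
    """Return the emotion name for an integer class label."""
    return ID2LABEL[label_id]
```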
cannlytics/cannabis_tests | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
license:
- cc-by-4.0
pretty_name: cannabis_tests
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- cannabis
- lab results
- tests
---
# Cannabis Tests, Curated by Cannlytics
<div style="margin-top:1rem; margin-bottom: 1rem;">
<img width="240px" alt="" src="https://firebasestorage.googleapis.com/v0/b/cannlytics.appspot.com/o/public%2Fimages%2Fdatasets%2Fcannabis_tests%2Fcannabis_tests_curated_by_cannlytics.png?alt=media&token=22e4d1da-6b30-4c3f-9ff7-1954ac2739b2">
</div>
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Data Collection and Normalization](#data-collection-and-normalization)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [License](#license)
- [Citation](#citation)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** <https://github.com/cannlytics/cannlytics>
- **Repository:** <https://huggingface.co/datasets/cannlytics/cannabis_tests>
- **Point of Contact:** <dev@cannlytics.com>
### Dataset Summary
This dataset is a collection of public cannabis lab test results parsed by [`CoADoc`](https://github.com/cannlytics/cannlytics/tree/main/cannlytics/data/coas), a certificate of analysis (COA) parsing tool.
## Dataset Structure
The dataset is partitioned into the various sources of lab results.
| Subset | Source | Observations |
|--------|--------|--------------|
| `rawgarden` | Raw Gardens | 2,667 |
| `mcrlabs` | MCR Labs | Coming soon! |
| `psilabs` | PSI Labs | Coming soon! |
| `sclabs` | SC Labs | Coming soon! |
| `washington` | Washington State | Coming soon! |
### Data Instances
You can load the `details` for each of the dataset files. For example:
```py
from datasets import load_dataset
# Download Raw Garden lab result details.
dataset = load_dataset('cannlytics/cannabis_tests', 'rawgarden')
details = dataset['details']
assert len(details) > 0
print('Downloaded %i observations.' % len(details))
```
> Note: Configurations for `results` and `values` are planned. For now, you can create these data with `CoADoc().save(details, out_file)`.
### Data Fields
Below is a non-exhaustive list of fields, used to standardize the various data that are encountered, that you may expect to encounter in the parsed COA data.
| Field | Example| Description |
|-------|-----|-------------|
| `analyses` | ["cannabinoids"] | A list of analyses performed on a given sample. |
| `{analysis}_method` | "HPLC" | The method used for each analysis. |
| `{analysis}_status` | "pass" | The pass, fail, or N/A status for pass / fail analyses. |
| `coa_urls` | [{"url": "", "filename": ""}] | A list of certificate of analysis (CoA) URLs. |
| `date_collected` | 2022-04-20T04:20 | An ISO-formatted time when the sample was collected. |
| `date_tested` | 2022-04-20T16:20 | An ISO-formatted time when the sample was tested. |
| `date_received` | 2022-04-20T12:20 | An ISO-formatted time when the sample was received. |
| `distributor` | "Your Favorite Dispo" | The name of the product distributor, if applicable. |
| `distributor_address` | "Under the Bridge, SF, CA 55555" | The distributor address, if applicable. |
| `distributor_street` | "Under the Bridge" | The distributor street, if applicable. |
| `distributor_city` | "SF" | The distributor city, if applicable. |
| `distributor_state` | "CA" | The distributor state, if applicable. |
| `distributor_zipcode` | "55555" | The distributor zip code, if applicable. |
| `distributor_license_number` | "L2Stat" | The distributor license number, if applicable. |
| `images` | [{"url": "", "filename": ""}] | A list of image URLs for the sample. |
| `lab_results_url` | "https://cannlytics.com/results" | A URL to the sample results online. |
| `producer` | "Grow Tent" | The producer of the sampled product. |
| `producer_address` | "3rd & Army, SF, CA 55555" | The producer's address. |
| `producer_street` | "3rd & Army" | The producer's street. |
| `producer_city` | "SF" | The producer's city. |
| `producer_state` | "CA" | The producer's state. |
| `producer_zipcode` | "55555" | The producer's zipcode. |
| `producer_license_number` | "L2Calc" | The producer's license number. |
| `product_name` | "Blue Rhino Pre-Roll" | The name of the product. |
| `lab_id` | "Sample-0001" | A lab-specific ID for the sample. |
| `product_type` | "flower" | The type of product. |
| `batch_number` | "Order-0001" | A batch number for the sample or product. |
| `metrc_ids` | ["1A4060300002199000003445"] | A list of relevant Metrc IDs. |
| `metrc_lab_id` | "1A4060300002199000003445" | The Metrc ID associated with the lab sample. |
| `metrc_source_id` | "1A4060300002199000003445" | The Metrc ID associated with the sampled product. |
| `product_size` | 2000 | The size of the product in milligrams. |
| `serving_size` | 1000 | An estimated serving size in milligrams. |
| `servings_per_package` | 2 | The number of servings per package. |
| `sample_weight` | 1 | The weight of the product sample in grams. |
| `results` | [{...},...] | A list of results, see below for result-specific fields. |
| `status` | "pass" | The overall pass / fail status for all contaminant screening analyses. |
| `total_cannabinoids` | 14.20 | The analytical total of all cannabinoids measured. |
| `total_thc` | 14.00 | The analytical total of THC and THCA. |
| `total_cbd` | 0.20 | The analytical total of CBD and CBDA. |
| `total_terpenes` | 0.42 | The sum of all terpenes measured. |
| `results_hash` | "{sha256-hash}" | An HMAC of the sample's `results` JSON signed with Cannlytics' public key, `"cannlytics.eth"`. |
| `sample_id` | "{sha256-hash}" | A generated ID to uniquely identify the `producer`, `product_name`, and `results`. |
| `sample_hash` | "{sha256-hash}" | An HMAC of the entire sample JSON signed with Cannlytics' public key, `"cannlytics.eth"`. |
<!-- | `strain_name` | "Blue Rhino" | A strain name, if specified. Otherwise, can be attempted to be parsed from the `product_name`. | -->
Each result can contain the following fields.
| Field | Example| Description |
|-------|--------|-------------|
| `analysis` | "pesticides" | The analysis used to obtain the result. |
| `key` | "pyrethrins" | A standardized key for the result analyte. |
| `name` | "Pyrethrins" | The lab's internal name for the result analyte. |
| `value` | 0.42 | The value of the result. |
| `mg_g` | 0.00000042 | The value of the result in milligrams per gram. |
| `units` | "ug/g" | The units for the result `value`, `limit`, `lod`, and `loq`. |
| `limit` | 0.5 | A pass / fail threshold for contaminant screening analyses. |
| `lod` | 0.01 | The limit of detection for the result analyte. Values below the `lod` are typically reported as `ND`. |
| `loq` | 0.1 | The limit of quantification for the result analyte. Values above the `lod` but below the `loq` are typically reported as `<LOQ`. |
| `status` | "pass" | The pass / fail status for contaminant screening analyses. |
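For analysis, the nested `results` list in each sample can be flattened into one row per analyte. A minimal sketch follows; the field names come from the tables above, but the sample dict and helper function are hypothetical illustrations, not part of the `cannlytics` API:

```python
def flatten_results(sample: dict) -> list[dict]:
    """Flatten a parsed COA sample into one row per result analyte,
    carrying the sample ID along with each result's fields."""
    rows = []
    for result in sample.get("results", []):
        row = {"sample_id": sample.get("sample_id")}
        row.update(result)
        rows.append(row)
    return rows

# A hypothetical sample using the fields documented above.
sample = {
    "sample_id": "abc123",
    "results": [
        {"analysis": "cannabinoids", "key": "thca", "value": 14.0, "units": "percent"},
        {"analysis": "pesticides", "key": "pyrethrins", "value": 0.42, "units": "ug/g"},
    ],
}
rows = flatten_results(sample)
```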
### Data Splits
The data is split into `details`, `results`, and `values` data. Configurations for `results` and `values` are planned. For now, you can create these data with:
```py
from cannlytics.data.coas import CoADoc
from datasets import load_dataset
import pandas as pd
# Download Raw Garden lab result details.
repo = 'cannlytics/cannabis_tests'
dataset = load_dataset(repo, 'rawgarden')
details = dataset['details']
# Save the data locally with "Details", "Results", and "Values" worksheets.
outfile = 'details.xlsx'
parser = CoADoc()
parser.save(details.to_pandas(), outfile)
# Read the values.
values = pd.read_excel(outfile, sheet_name='Values')
# Read the results.
results = pd.read_excel(outfile, sheet_name='Results')
```
<!-- Training data is used for training your models. Validation data is used for evaluating your trained models, to help you determine a final model. Test data is used to evaluate your final model. -->
## Dataset Creation
### Curation Rationale
Certificates of analysis (CoAs) are abundant for cannabis cultivators, processors, retailers, and consumers too, but the data is often locked away. Rich, valuable laboratory data so close, yet so far away! CoADoc puts these vital data points in your hands by parsing PDFs and URLs, finding all the data, standardizing the data, and cleanly returning the data to you.
### Source Data
| Data Source | URL |
|-------------|-----|
| MCR Labs Test Results | <https://reports.mcrlabs.com> |
| PSI Labs Test Results | <https://results.psilabs.org/test-results/> |
| Raw Garden Test Results | <https://rawgarden.farm/lab-results/> |
| SC Labs Test Results | <https://client.sclabs.com/> |
| Washington State Lab Test Results | <https://lcb.app.box.com/s/e89t59s0yb558tjoncjsid710oirqbgd> |
#### Data Collection and Normalization
You can recreate the dataset using the open source algorithms in the repository. First clone the repository:
```
git clone https://huggingface.co/datasets/cannlytics/cannabis_tests
```
You can then install the algorithm Python (3.9+) requirements:
```
cd cannabis_tests
pip install -r requirements.txt
```
Then you can run all of the data-collection algorithms:
```
python algorithms/main.py
```
Or you can run each algorithm individually. For example:
```
python algorithms/get_results_mcrlabs.py
```
In the `algorithms` directory, you can find the data collection scripts described in the table below.
| Algorithm | Organization | Description |
|-----------|---------------|-------------|
| `get_results_mcrlabs.py` | MCR Labs | Get lab results published by MCR Labs. |
| `get_results_psilabs.py` | PSI Labs | Get historic lab results published by PSI Labs. |
| `get_results_rawgarden.py` | Raw Garden | Get lab results Raw Garden publishes for their products. |
| `get_results_sclabs.py` | SC Labs | Get lab results published by SC Labs. |
| `get_results_washington.py` | Washington State | Get historic lab results obtained through a FOIA request in Washington State. |
### Personal and Sensitive Information
The dataset includes public addresses and contact information for related cannabis licensees. It is important to take care to use these data points in a legal manner.
## Considerations for Using the Data
### Social Impact of Dataset
Arguably, there is substantial social impact that could result from the study of cannabis, therefore, researchers and data consumers alike should take the utmost care in the use of this dataset.
### Discussion of Biases
Cannlytics is a for-profit data and analytics company that primarily serves cannabis businesses. The data are not randomly collected and thus sampling bias should be taken into consideration.
### Other Known Limitations
The data represents only a subset of the population of cannabis lab results. Non-standard values are coded as follows.
| Actual | Coding |
|--------|--------|
| `'ND'` | `0.000000001` |
| `'No detection in 1 gram'` | `0.000000001` |
| `'Negative/1g'` | `0.000000001` |
| `'PASS'` | `0.000000001` |
| `'<LOD'` | `0.00000001` |
| `'< LOD'` | `0.00000001` |
| `'<LOQ'` | `0.0000001` |
| `'< LOQ'` | `0.0000001` |
| `'<LLOQ'` | `0.0000001` |
| `'≥ LOD'` | `10001` |
| `'NR'` | `None` |
| `'N/A'` | `None` |
| `'na'` | `None` |
| `'NT'` | `None` |
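The coding above can be applied when cleaning raw result values. The following is a minimal sketch, assuming string values arrive exactly as listed in the table; the function name and structure are illustrative, not part of the repository's algorithms:

```python
# Coding of non-standard result values, taken from the table above.
NON_STANDARD_CODING = {
    "ND": 0.000000001,
    "No detection in 1 gram": 0.000000001,
    "Negative/1g": 0.000000001,
    "PASS": 0.000000001,
    "<LOD": 0.00000001,
    "< LOD": 0.00000001,
    "<LOQ": 0.0000001,
    "< LOQ": 0.0000001,
    "<LLOQ": 0.0000001,
    "NR": None,
    "N/A": None,
    "na": None,
    "NT": None,
}

def code_value(raw):
    """Code a non-standard string value, passing numeric values through unchanged."""
    if isinstance(raw, str) and raw in NON_STANDARD_CODING:
        return NON_STANDARD_CODING[raw]
    return raw
```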
## Additional Information
### Dataset Curators
Curated by [🔥Cannlytics](https://cannlytics.com)<br>
<dev@cannlytics.com>
### License
```
Copyright (c) 2022 Cannlytics and the Cannabis Data Science Team
The files associated with this dataset are licensed under a
Creative Commons Attribution 4.0 International license.
You can share, copy and modify this dataset so long as you give
appropriate credit, provide a link to the CC BY license, and
indicate if changes were made, but you may not do so in a way
that suggests the rights holder has endorsed you or your use of
the dataset. Note that further permission may be required for
any content within the dataset that is identified as belonging
to a third party.
```
### Citation
Please cite the following if you use the code examples in your research:
```bibtex
@misc{cannlytics2022,
title={Cannabis Data Science},
author={Skeate, Keegan and O'Sullivan-Sutherland, Candace},
journal={https://github.com/cannlytics/cannabis-data-science},
year={2022}
}
```
### Contributions
Thanks to [🔥Cannlytics](https://cannlytics.com), [@candy-o](https://github.com/candy-o), [@hcadeaux](https://huggingface.co/hcadeaux), [@keeganskeate](https://github.com/keeganskeate), [The CESC](https://thecesc.org), and the entire [Cannabis Data Science Team](https://meetup.com/cannabis-data-science/members) for their contributions.
|
niwator1/required_subject_1 | ---
license: apache-2.0
---
|
CyberHarem/fuxi_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fuxi (Houkai 3rd)
This is the dataset of fuxi (Houkai 3rd), containing 12 images and their tags.
The core tags of this character are `bangs, blue_eyes, long_hair, hair_ornament, black_hair, hair_bun, very_long_hair, blunt_bangs, braid, brown_hair, breasts, double_bun, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 13.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 6.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 13.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 11.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 20.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fuxi_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, bare_shoulders, looking_at_viewer, collarbone, detached_sleeves, long_sleeves, snake, white_dress, barefoot, parted_lips, sitting, sleeves_past_wrists, strapless, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | looking_at_viewer | collarbone | detached_sleeves | long_sleeves | snake | white_dress | barefoot | parted_lips | sitting | sleeves_past_wrists | strapless | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------------------|:-------------|:-------------------|:---------------|:--------|:--------------|:-----------|:--------------|:----------|:----------------------|:------------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
sethapun/arithmetic_2all_1to100 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 57780
num_examples: 2000
- name: validation
num_bytes: 11516
num_examples: 400
download_size: 26097
dataset_size: 69296
---
# Dataset Card for "arithmetic_2all_1to100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgallouedec/prj_gia_dataset_metaworld_disassemble_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the disassemble-v2 environment, sampled from the policy disassemble-v2.
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_disassemble_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_disassemble_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
|
SuodhanJ6/train | ---
license: mit
---
|
Arindam0231/Adult-Income-Alpaca | ---
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 18190039
num_examples: 48842
download_size: 478121
dataset_size: 18190039
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FER-Universe/DiffusionFER | ---
layout: default
title: Home
nav_order: 1
has_children: false
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
pretty_name: DiffusionFER
size_categories:
- n<500MB
source_datasets:
- original
license: cc0-1.0
tags:
- stable diffusion
- prompt engineering
- prompts
- research paper
- facial expression recognition
- emotion recognition
task_categories:
- text-to-image
task_ids:
- image-captioning
- face-detection
---
## Dataset Description
- **Homepage:** [DiffusionFER homepage](https://kdhht2334.github.io/)
- **Repository:** [DiffusionFER repository](https://github.com/kdhht2334/Facial-Expression-Recognition-Zoo)
- **Distribution:** [DiffusionFER Hugging Face Dataset](https://huggingface.co/datasets/FER-Universe/DiffusionFER)
- **Point of Contact:** [Daeha Kim](mailto:kdhht5022@gmail.com)
### Summary
DiffusionFER is a large-scale text-to-image prompt database for face-related tasks. It contains about **1M (ongoing)** images generated by [Stable Diffusion](https://github.com/camenduru/stable-diffusion-webui-colab) using prompt(s) and other parameters.
DiffusionFER is available at [🤗 Hugging Face Dataset](https://huggingface.co/datasets/FER-Universe/DiffusionFER).
### Downstream Tasks and Leaderboards
This DiffusionFER dataset can be utilized for the following downstream tasks.
- Face detection
- Facial expression recognition
- Text-to-emotion prompting
In addition, the virtual subjects included in this dataset provide opportunities to perform various vision tasks related to face privacy.
### Data Loading
DiffusionFER can be loaded via both Python and Git. Please refer Hugging Face [`Datasets`](https://huggingface.co/docs/datasets/quickstart).
```python
from datasets import load_dataset
dataset = load_dataset("FER-Universe/DiffusionFER")
```
```bash
git lfs install
git clone https://huggingface.co/datasets/FER-Universe/DiffusionFER
```
### Pre-trained model
You can easily download and use a pre-trained __Swin Transformer__ model trained on the `Diffusion_Emotion_S` dataset.
Transformer models trained on `Diffusion_Emotion_M` or `Diffusion_Emotion_L` will be released later.
```python
from transformers import AutoFeatureExtractor, AutoModelForImageClassification
extractor = AutoFeatureExtractor.from_pretrained("kdhht2334/autotrain-diffusion-emotion-facial-expression-recognition-40429105176")
model = AutoModelForImageClassification.from_pretrained("kdhht2334/autotrain-diffusion-emotion-facial-expression-recognition-40429105176")
```
Or just clone the model repo
```bash
git lfs install
git clone https://huggingface.co/kdhht2334/autotrain-diffusion-emotion-facial-expression-recognition-40429105176
```
- Quick links: [huggingface model documentation](https://huggingface.co/docs/transformers/main/en/model_doc/swin#transformers.SwinForImageClassification)
### Sample Gallery
▼Happy

▼Angry

### Subsets
DiffusionFER supports a total of three distinct splits, and each split additionally provides face regions cropped by a [face detector](https://github.com/timesler/facenet-pytorch).
- DifussionEmotion_S (small), DifussionEmotion_M (medium), DifussionEmotion_L (large).
|Subset|Num of Images|Size|Image Directory |
|:--|--:|--:|--:|
|DifussionEmotion_S (original) | 1.5K | 647M | `DifussionEmotion_S/` |
|DifussionEmotion_S (cropped) | 1.5K | 322M | `DiffusionEmotion_S_cropped/` |
|DifussionEmotion_M (original) | N/A | N/A | `DifussionEmotion_M/` |
|DifussionEmotion_M (cropped) | N/A | N/A | `DiffusionEmotion_M_cropped/` |
|DifussionEmotion_L (original) | N/A | N/A | `DifussionEmotion_L/` |
|DifussionEmotion_L (cropped) | N/A | N/A | `DiffusionEmotion_L_cropped/` |
## Dataset Structure
We provide DiffusionFER using a modular file structure. `DiffusionEmotion_S`, the smallest scale, contains about 1,500 images divided into folders, one for each of the 7 emotion classes. The class labels of all these images are included in `dataset_sheet.csv`.
- In `dataset_sheet.csv`, not only the 7 emotion classes but also _valence-arousal_ values are annotated.
```bash
# Small version of DB
./
├── DifussionEmotion_S
│ ├── angry
│ │ ├── aaaaaaaa_6.png
│ │ ├── andtcvhp_6.png
│ │ ├── azikakjh_6.png
│ │ ├── [...]
│ ├── fear
│ ├── happy
│ ├── [...]
│ └── surprise
└── dataset_sheet.csv
```
- The medium-size DB will be uploaded soon.
```bash
# Medium version of DB
(ongoing)
```
- TBD
```bash
# Large version of DB
(ongoing)
```
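Since the per-image labels live in `dataset_sheet.csv`, they can be inspected with pandas. Below is a minimal sketch; the column names (`subDirectory_filePath`, `expression`, `valence`, `arousal`) are our assumption about the sheet's layout, so check the actual file before relying on them:

```python
import io

import pandas as pd

# Hypothetical rows mirroring the assumed layout of dataset_sheet.csv:
# a relative image path, one of the 7 emotion labels, and valence/arousal values.
sample_csv = io.StringIO(
    "subDirectory_filePath,expression,valence,arousal\n"
    "angry/img_0001.png,angry,-0.4,0.6\n"
    "happy/img_0002.png,happy,0.8,0.3\n"
)
df = pd.read_csv(sample_csv)

# Per-class counts and mean valence/arousal
print(df["expression"].value_counts())
print(df.groupby("expression")[["valence", "arousal"]].mean())
```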
### Prompt Format
The basic format is as follows: "`Emotion`, `Race` `Age` style, a realistic portrait of `Style` `Gender`, upper body, `Others`".
- ex) one person, neutral emotion, white middle-aged style, a realistic portrait of man, upper body
Examples of format categories are listed in the table below.
| Category | Prompt(s) |
| --- | --- |
| `Emotion` | neutral emotion<br>happy emotion, with open mouth, smiley<br>sad emotion, with tears, lowered head, droopy eyebrows<br>surprise emotion, with open mouth, big eyes<br>fear emotion, scared, haunted<br>disgust emotion, frown, angry expression with open mouth<br>angry emotion, with open mouth, frown eyebrow, fierce, furious |
| `Race` | white<br>black<br>latin |
| `Age` | teen<br>middle-aged<br>old |
| `Gender` | man<br>woman |
| `Style` | gentle<br>handsome<br>pretty<br>cute<br>mature<br>punky<br>freckles<br>beautiful crystal eyes<br>big eyes<br>small nose<br>... |
| `Others` | 4K<br>8K<br>cyberpunk<br>camping<br>ancient<br>medieval Europe<br>... |
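As an illustration, the categories above can be combined into a prompt programmatically. The helper below is only a sketch of ours, not part of the dataset tooling:

```python
def build_prompt(emotion, race, age, gender, style="", others=""):
    """Assemble a prompt following the basic format described above."""
    style_part = f"{style} " if style else ""
    prompt = f"{emotion}, {race} {age} style, a realistic portrait of {style_part}{gender}, upper body"
    if others:
        prompt += f", {others}"
    return prompt

print(build_prompt("neutral emotion", "white", "middle-aged", "man"))
# neutral emotion, white middle-aged style, a realistic portrait of man, upper body
```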
### Prompt Engineering
You can improve the quality and consistency of images generated from the prompts above with the settings below.
```
{
"negative prompt": "sketches, (worst quality:2), (low quality:2), (normal quality:2), lowres, normal quality, ((monochrome)), ((grayscale)), skin spots, acnes, skin blemishes, backlight, (duplicate:1.331), (morbid:1.21), (mutilated:1.21), mutated hands, (poorly drawn hands:1.331), (bad anatomy:1.21), (bad proportions:1.331), extra limbs, (disfigured:1.331), (missing arms:1.331), (extra legs:1.331), (fused fingers:1.61051), (too many fingers:1.61051), (unclear eyes:1.331), bad hands, missing fingers, extra digit",
"steps": 50,
"sampling method": "DPM++ 2M Karras"
"Width": "512",
"Height": "512",
"CFG scale": 12.0,
"seed": -1,
}
```
### Annotations
DiffusionFER is annotated with both the 7 emotion classes and valence-arousal values.
#### Annotation process
This annotation process was inspired by the theory presented in the two research papers below.
- JA Russell, [A circumplex model of affect](https://d1wqtxts1xzle7.cloudfront.net/38425675/Russell1980-libre.pdf?1439132613=&response-content-disposition=inline%3B+filename%3DRussell1980.pdf&Expires=1678595455&Signature=UtbPsezND6w8vbISBiuL-ECk6hDI0etLcJSE7kJMC~hAkMSu9YyQcPKdVpdHSSq7idfcQ~eEKsqptvYpy0199DX0gi-nHJwhsciahC-zgDwylEUo6ykhP6Ab8VWCOW-DM21jHNvbYLQf7Pwi66fGvm~5bAXPc1o4HHpQpk-Cr7b0tW9lYnl3qgLoVeIICg6FLu0elbtVztgH5OS1uL6V~QhiP2PCwZf~WCHuJRQrWdPt5Kuco0lsNr1Qikk1~d7HY3ZcUTRZcMNDdem8XAFDH~ak3QER6Ml~JDkNFcLuygz~tjL4CdScVhByeAuMe3juyijtBFtYWH2h30iRkUDalg__&Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA)
- A Mollahosseini et al., [AffectNet](https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8013713&casa_token=C3QmhmiB6Y8AAAAA:1CiUll0bhIq06M17YwFIvxuse7GOosEN9G1A8vxVzR8Vb5eaFp6ERIjg7xhSIQlf008KLsfJ-w&tag=1)
#### Who are the annotators?
[Daeha Kim](mailto:kdhht5022@gmail.com) and [Dohee Kang](mailto:asrs777@naver.com)
## Additional Information
### Dataset Curators
DiffusionFER is created by [Daeha Kim](https://kdhht2334.github.io/) and [Dohee Kang](https://github.com/KangDohee2270).
### Acknowledgments
This repository is heavily inspired by [DiffusionDB](https://huggingface.co/datasets/poloclub/diffusiondb), with some format references. Thank you for your interest in [DiffusionDB](https://huggingface.co/datasets/poloclub/diffusiondb).
### Licensing Information
The DiffusionFER is available under the [CC0 1.0 License](https://creativecommons.org/publicdomain/zero/1.0/). NOTE: The primary purpose of this dataset is research. We are not responsible if you take any other action using this dataset.
### Contributions
If you have any questions, feel free to [open an issue](https://github.com/kdhht2334/Facial-Expression-Recognition-Zoo/issues/new) or contact [Daeha Kim](https://kdhht2334.github.io/). |
moyoweke/nollydata | ---
license: apache-2.0
---
|
Falah/ali_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 367063
num_examples: 1000
download_size: 19378
dataset_size: 367063
---
# Dataset Card for "ali_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
albertvillanova/vbsv-dataset | ---
configs:
- config_name: default
data_files: "dataset.csv"
sep: "|"
--- |
liuyanchen1015/MULTI_VALUE_mnli_analytic_whose_relativizer | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 5186
num_examples: 16
- name: dev_mismatched
num_bytes: 9084
num_examples: 30
- name: test_matched
num_bytes: 7546
num_examples: 30
- name: test_mismatched
num_bytes: 8310
num_examples: 33
- name: train
num_bytes: 261182
num_examples: 928
download_size: 140498
dataset_size: 291308
---
# Dataset Card for "MULTI_VALUE_mnli_analytic_whose_relativizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/classification_arabic_dialects | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Algeria
'1': Egypt
'2': Iraq
'3': Jordan
'4': Morocco
'5': Saudi_Arabia
'6': Sudan
'7': Syria
'8': Tunisia
'9': Yemen
splits:
- name: train
num_bytes: 166407297.0
num_examples: 130
download_size: 158117904
dataset_size: 166407297.0
---
# Classification of Arabic Dialects Audio Dataset
This dataset contains audio samples of various Arabic dialects for the task of classification and recognition. The dataset aims to assist researchers and practitioners in developing models and systems for Arabic spoken language analysis and understanding.
## Dataset Details
- Dataset Name: Classification of Arabic Dialects Audio Dataset
- Dataset URL: [Falah/classification_arabic_dialects](https://huggingface.co/datasets/Falah/classification_arabic_dialects)
- Dataset Size: 166,407,297 bytes
- Download Size: 158,117,904 bytes
- Splits:
- Train: 130 examples
## Class Labels and Mapping
The dataset consists of audio samples from the following Arabic dialects, along with their corresponding class labels:
- '0': Algeria
- '1': Egypt
- '2': Iraq
- '3': Jordan
- '4': Morocco
- '5': Saudi Arabia
- '6': Sudan
- '7': Syria
- '8': Tunisia
- '9': Yemen
Please refer to the dataset for the audio samples and their respective class labels.
## Usage Example
To play and display an audio sample from the dataset, you can use the following code:
```python
from datasets import load_dataset
from IPython.display import Audio, display

# Load the dataset from the Hub
dataset = load_dataset("Falah/classification_arabic_dialects")

country_names = ['Algeria', 'Egypt', 'Iraq', 'Jordan', 'Morocco', 'Saudi_Arabia', 'Sudan', 'Syria', 'Tunisia', 'Yemen']

index = 0  # Index of the audio example
label = dataset["train"][index]["label"]
country_name = country_names[int(label)]
audio_data = dataset["train"][index]["audio"]["array"]
sampling_rate = dataset["train"][index]["audio"]["sampling_rate"]

# Play audio
display(Audio(audio_data, rate=sampling_rate))
print("Class Label:", label)
print("Country Name:", country_name)
```
Make sure to replace `index` with the desired index of the audio example. This code will play the audio, display it, and print its associated class label and the matched country name from the `country_names` list.
## Applications
The Classification of Arabic Dialects Audio Dataset can be utilized in various applications, including but not limited to:
- Arabic dialect classification
- Arabic spoken language recognition
- Speech analysis and understanding for Arabic dialects
- Acoustic modeling for Arabic dialects
- Cross-dialect speech processing and synthesis
Feel free to explore and leverage this dataset for your research and development tasks related to Arabic spoken language analysis and recognition.
## License
The dataset is made available under the terms of the [Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)](https://creativecommons.org/licenses/by-sa/4.0/) license.
## Citation
If you use this dataset in your research or any other work, please consider citing it as follows.
For more information or inquiries about the dataset, please contact the dataset author(s) mentioned in the citation.
```
@dataset{classification_arabic_dialects,
author = {Falah.G.Salieh},
title = {Classification of Arabic Dialects Audio Dataset},
year = {2023},
publisher = {Hugging Face},
url = {https://huggingface.co/datasets/Falah/classification_arabic_dialects},
}
``` |
cjensen/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Carrot_Top
'1': Chris_Hemsworth
'2': Gru
'3': Michael_Jordan
'4': Mother_Teresa
'5': Winona_Ryder
splits:
- name: train
num_bytes: 8636520.0
num_examples: 18
download_size: 8635182
dataset_size: 8636520.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_111 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1173232444
num_examples: 230407
download_size: 1180373791
dataset_size: 1173232444
---
# Dataset Card for "chunk_111"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jbrazzy/baby_names | ---
dataset_info:
features:
- name: Names
dtype: string
- name: Sex
dtype: string
- name: Count
dtype: int64
- name: Year
dtype: int64
splits:
- name: train
num_bytes: 33860482
num_examples: 1084385
- name: test
num_bytes: 8482889
num_examples: 271663
download_size: 13301020
dataset_size: 42343371
---
# Dataset Card for "baby_names"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AppleHarem/rockrock_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of rockrock (Arknights)
This is the dataset of rockrock (Arknights), containing 48 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
This WebUI contains the crawlers and other tools: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 48 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 132 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 141 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 48 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 48 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 48 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 132 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 132 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 121 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 141 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 141 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Vitorbr2009/voz-afauna-treinada | ---
license: openrail
---
|
premai-io/sd-ml-assignment | ---
license: mit
task_categories:
- text-to-image
---
# Text to Image Dataset for Pixel Art style
## Dataset Description
The dataset contains 100 examples of images representing different topics, all in the same style.
severo/doc-image-audio-1 | ---
size_categories:
- n<1K
---
# [doc] image + audio dataset 1
This dataset contains 4 jpeg image files and 4 wav audio files at the root.
|
open-llm-leaderboard/details_bofenghuang__vigogne-33b-instruct | ---
pretty_name: Evaluation run of bofenghuang/vigogne-33b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bofenghuang/vigogne-33b-instruct](https://huggingface.co/bofenghuang/vigogne-33b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-33b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T06:48:17.282592](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-33b-instruct/blob/main/results_2023-10-17T06-48-17.282592.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4092911073825503,\n\
\ \"em_stderr\": 0.005035499534676373,\n \"f1\": 0.47988779362416334,\n\
\ \"f1_stderr\": 0.004806379711128169,\n \"acc\": 0.4499623916853611,\n\
\ \"acc_stderr\": 0.010072884519008809\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4092911073825503,\n \"em_stderr\": 0.005035499534676373,\n\
\ \"f1\": 0.47988779362416334,\n \"f1_stderr\": 0.004806379711128169\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11144806671721001,\n \
\ \"acc_stderr\": 0.008668021353794433\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223187\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bofenghuang/vigogne-33b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T06_48_17.282592
path:
- '**/details_harness|drop|3_2023-10-17T06-48-17.282592.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T06-48-17.282592.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T06_48_17.282592
path:
- '**/details_harness|gsm8k|5_2023-10-17T06-48-17.282592.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T06-48-17.282592.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T06_48_17.282592
path:
- '**/details_harness|winogrande|5_2023-10-17T06-48-17.282592.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T06-48-17.282592.parquet'
- config_name: results
data_files:
- split: 2023_10_17T06_48_17.282592
path:
- results_2023-10-17T06-48-17.282592.parquet
- split: latest
path:
- results_2023-10-17T06-48-17.282592.parquet
---
# Dataset Card for Evaluation run of bofenghuang/vigogne-33b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bofenghuang/vigogne-33b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-33b-instruct](https://huggingface.co/bofenghuang/vigogne-33b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-33b-instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T06:48:17.282592](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-33b-instruct/blob/main/results_2023-10-17T06-48-17.282592.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4092911073825503,
"em_stderr": 0.005035499534676373,
"f1": 0.47988779362416334,
"f1_stderr": 0.004806379711128169,
"acc": 0.4499623916853611,
"acc_stderr": 0.010072884519008809
},
"harness|drop|3": {
"em": 0.4092911073825503,
"em_stderr": 0.005035499534676373,
"f1": 0.47988779362416334,
"f1_stderr": 0.004806379711128169
},
"harness|gsm8k|5": {
"acc": 0.11144806671721001,
"acc_stderr": 0.008668021353794433
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223187
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sriramahesh2000/GenetarionDataset | ---
license: apache-2.0
--- |
PragueMan/beautifuldata | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-big_patent-y-b4cccf-1519855005 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- big_patent
eval_info:
task: summarization
model: pszemraj/pegasus-x-large-book-summary
metrics: []
dataset_name: big_patent
dataset_config: y
dataset_split: test
col_mapping:
text: description
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/pegasus-x-large-book-summary
* Dataset: big_patent
* Config: y
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
MihaiIonascu/Azure_IaC_test | ---
license: apache-2.0
---
|
batzorigco/autotrain-data-autotrain-gg2pj-co58q | ---
license: apache-2.0
dataset_info:
features:
- name: autotrain_text
dtype: string
- name: autotrain_label
dtype:
class_label:
names:
'0': ham
'1': spam
splits:
- name: train
num_bytes: 5535384
num_examples: 16278
- name: validation
num_bytes: 1359462
num_examples: 4070
download_size: 4270927
dataset_size: 6894846
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
BangumiBase/shinmaimaounotestament | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Shinmai Maou No Testament
This is the image base of bangumi Shinmai Maou no Testament, we detected 35 characters, 3166 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 811 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 58 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 67 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 24 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 49 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 14 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 19 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 6 | [Download](7/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 8 | 9 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 11 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 31 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 58 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 120 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 97 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 22 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 11 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 43 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 27 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 14 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 541 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 12 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 11 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 9 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 20 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 9 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 6 | [Download](25/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 26 | 7 | [Download](26/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 27 | 348 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 26 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 43 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 11 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 40 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 15 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 12 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 565 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Yeerchiu/mmm_lmd_8bars | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2877881612
num_examples: 162599
download_size: 464647020
dataset_size: 2877881612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
juancavallotti/multilingual-gec | ---
author: Juan Alberto López Cavallotti
date: Jan 6, 2023
license: apache-2.0
task_categories:
- translation
language:
- en
- es
- fr
- de
tags:
- grammar
- gec
- multi language
- language detection
pretty_name: Multi Lingual Grammar Error Correction Dataset
size_categories:
- 100K<n<1M
---
# Dataset Card for Multilingual Grammar Error Correction
## Dataset Description
- **Homepage:** https://juancavallotti.com
- **Paper:** https://blog.juancavallotti.com/2023/01/06/training-a-multi-language-grammar-error-correction-system/
- **Point of Contact:** Juan Alberto López Cavallotti
### Dataset Summary
This dataset can be used to train a transformer model (we used T5) to correct grammar errors in simple sentences written in English, Spanish, French, or German.
This dataset was developed as a component for the [Squidigies](https://squidgies.app/) platform.
### Supported Tasks and Leaderboards
* **Grammar Error Correction:** By appending the prefix *fix grammar:* to the prompt.
* **Language Detection:** By appending the prefix *language:* to the prompt.
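Since both tasks are selected purely by a text prefix, preparing model inputs is plain string concatenation. A minimal sketch (the helper name is ours, not part of the dataset):

```python
def to_model_input(sentence: str, task: str = "fix grammar") -> str:
    """Prepend the task prefix expected by a model trained on this dataset.

    task should be "fix grammar" (grammar error correction)
    or "language" (language detection).
    """
    return f"{task}: {sentence}"

print(to_model_input("She go to school every day."))         # fix grammar: She go to school every day.
print(to_model_input("Das ist ein Satz.", task="language"))  # language: Das ist ein Satz.
```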
### Languages
* English
* Spanish
* French
* German
## Dataset Structure
### Data Instances
The dataset contains the following instances for each language:
* German 32282 sentences.
* English 51393 sentences.
* Spanish 67672 sentences.
* French 67157 sentences.
### Data Fields
* `lang`: The language of the sentence
* `sentence`: The original sentence.
* `modified`: The corrupted sentence.
* `transformation`: The primary transformation used by the synthetic data generator.
* `sec_transformation`: The secondary transformation (if any) used by the synthetic data generator.
### Data Splits
* `train`: There isn't a specific split defined. I recommend using 1k sentences sampled randomly from each language, combined with the SacreBleu metric.
## Dataset Creation
### Curation Rationale
This dataset was generated synthetically through code, with the help of information about common grammar errors harvested from the internet.
### Source Data
#### Initial Data Collection and Normalization
The source grammatical sentences come from various open-source datasets, such as Tatoeba.
#### Who are the source language producers?
* Juan Alberto López Cavallotti
### Annotations
#### Annotation process
The annotation is automatic and produced by the generation script.
#### Who are the annotators?
* Data generation script by Juan Alberto López Cavallotti
### Other Known Limitations
The dataset doesn't cover all the possible grammar errors but serves as a starting point that generates fair results.
## Additional Information
### Dataset Curators
* Juan Alberto López Cavallotti
### Licensing Information
This dataset is distributed under the [Apache 2 License](https://www.apache.org/licenses/LICENSE-2.0)
### Citation Information
Please mention this original dataset and the author **Juan Alberto López Cavallotti**
### Contributions
* Juan Alberto López Cavallotti |
dash8x/dv-presidential-speech | ---
license: apache-2.0
task_categories:
- automatic-speech-recognition
- text-to-speech
language:
- dv
tags:
- audio
- dhivehi
- yag
- speech
- president
- political
size_categories:
- 1K<n<10K
---
# Dataset Card for Dhivehi Presidential Speech 1.0
### Dataset Summary
Dhivehi Presidential Speech is a Dhivehi speech dataset created from data extracted and processed by [Sofwath](https://github.com/Sofwath) as part of a collection of Dhivehi datasets found [here](https://github.com/Sofwath/DhivehiDatasets).
The dataset contains around 2.5 hrs (1 GB) of speech collected from Maldives President's Office consisting of 7 speeches given by President Yaameen Abdhul Gayyoom.
### Supported Tasks and Leaderboards
- Automatic Speech Recognition
- Text-to-Speech
### Languages
Dhivehi
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file and its sentence.
```json
{
'path': 'dv-presidential-speech-train/waves/YAG2_77.wav',
'sentence': 'އަދި އަޅުގަނޑުމެންގެ ސަރަޙައްދުގައިވެސް މިކަހަލަ ބޭބޭފުޅުން',
'audio': {
'path': 'dv-presidential-speech-train/waves/YAG2_77.wav',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000
},
}
```
### Data Fields
- path (string): The path to the audio file.
- sentence (string): The transcription for the audio file.
- audio (dict): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column (`dataset[0]["audio"]`), the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, i.e. `dataset[0]["audio"]` should always be preferred over `dataset["audio"][0]`.
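To illustrate how the `array` and `sampling_rate` fields relate, here is a small self-contained sketch using a synthetic example shaped like the instance above (real values would come from `load_dataset`):

```python
import numpy as np

# Synthetic stand-in for one decoded example; real data comes from load_dataset.
example = {
    "sentence": "...",
    "audio": {
        "array": np.zeros(16000 * 3, dtype=np.float32),  # 3 seconds of silence
        "sampling_rate": 16000,
    },
}

# Duration in seconds = number of samples / samples per second
duration_s = len(example["audio"]["array"]) / example["audio"]["sampling_rate"]
print(f"duration: {duration_s:.1f} s")  # duration: 3.0 s
```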
### Data Splits
The speech material has been subdivided into portions for train, test and validation. The test clips were generated from a speech not in the train split. For the validation split, there is a slight overlap of 1 speech in the train set.
| | Train | Validation | Test |
| ---------------- | -------- | ---------- | ----- |
| Speakers | 1 | 1 | 1 |
| Utterances | 1612 | 200 | 200 |
| Duration | 02:14:59 | 17:02 | 13:30 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Extracted and processed by [Sofwath](https://github.com/Sofwath) as part of a collection of Dhivehi datasets found [here](https://github.com/Sofwath/DhivehiDatasets).
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
darksam/socialmedia-abuse | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: train
num_bytes: 1074806
num_examples: 8530
download_size: 698844
dataset_size: 1074806
---
# Dataset Card for "socialmedia-abuse"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2 | ---
pretty_name: Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WebraftAI/synapsellm-7b-mistral-v0.4-preview2](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T19:57:57.872670](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2/blob/main/results_2023-12-09T19-57-57.872670.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5440235971329553,\n\
\ \"acc_stderr\": 0.03410726380039453,\n \"acc_norm\": 0.5490928177495088,\n\
\ \"acc_norm_stderr\": 0.03483965758622219,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5379290576758808,\n\
\ \"mc2_stderr\": 0.01514579551273296\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5298634812286689,\n \"acc_norm_stderr\": 0.014585305840007105\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5582553276239793,\n\
\ \"acc_stderr\": 0.004955798214513426,\n \"acc_norm\": 0.7453694483170683,\n\
\ \"acc_norm_stderr\": 0.004347629889040944\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.0248708152510571,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.0248708152510571\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.02727389059430064,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.02727389059430064\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615486,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615486\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.03257714077709662\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954915,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954915\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.032339434681820885,\n\
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.032339434681820885\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7137614678899082,\n \"acc_stderr\": 0.01937943662891999,\n \"\
acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.01937943662891999\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753088,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753088\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.026613350840261743,\n\
\ \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.026613350840261743\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20446927374301677,\n\
\ \"acc_stderr\": 0.013488813404711903,\n \"acc_norm\": 0.20446927374301677,\n\
\ \"acc_norm_stderr\": 0.013488813404711903\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.0272725828498398,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.0272725828498398\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39374185136897,\n\
\ \"acc_stderr\": 0.012478532272564447,\n \"acc_norm\": 0.39374185136897,\n\
\ \"acc_norm_stderr\": 0.012478532272564447\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.03000856284500348,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.03000856284500348\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355586,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355586\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5379290576758808,\n\
\ \"mc2_stderr\": 0.01514579551273296\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998285\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25701288855193327,\n \
\ \"acc_stderr\": 0.012036781757428675\n }\n}\n```"
repo_url: https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-57-57.872670.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- '**/details_harness|winogrande|5_2023-12-09T19-57-57.872670.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T19-57-57.872670.parquet'
- config_name: results
data_files:
- split: 2023_12_09T19_57_57.872670
path:
- results_2023-12-09T19-57-57.872670.parquet
- split: latest
path:
- results_2023-12-09T19-57-57.872670.parquet
---
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.4-preview2](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2",
"harness_winogrande_5",
split="train")
```
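Each run is stored under a split named after its timestamp, and the `latest` split aliases the newest one. As an illustrative sketch (this helper is not part of the `datasets` API), the newest timestamped split name can also be resolved directly:

```python
from datetime import datetime

def newest_run_split(split_names):
    """Return the most recent timestamp-named split, ignoring the 'latest' alias."""
    fmt = "%Y_%m_%dT%H_%M_%S.%f"  # matches split names like 2023_12_09T19_57_57.872670
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped, key=lambda s: datetime.strptime(s, fmt))

splits = ["2023_12_09T19_57_57.872670", "latest"]
print(newest_run_split(splits))  # 2023_12_09T19_57_57.872670
```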
## Latest results
These are the [latest results from run 2023-12-09T19:57:57.872670](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2/blob/main/results_2023-12-09T19-57-57.872670.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5440235971329553,
"acc_stderr": 0.03410726380039453,
"acc_norm": 0.5490928177495088,
"acc_norm_stderr": 0.03483965758622219,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5379290576758808,
"mc2_stderr": 0.01514579551273296
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5298634812286689,
"acc_norm_stderr": 0.014585305840007105
},
"harness|hellaswag|10": {
"acc": 0.5582553276239793,
"acc_stderr": 0.004955798214513426,
"acc_norm": 0.7453694483170683,
"acc_norm_stderr": 0.004347629889040944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.042667634040995814,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.042667634040995814
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.0248708152510571,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.0248708152510571
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.02727389059430064,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.02727389059430064
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036810508691615486,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036810508691615486
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954915,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954915
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.032339434681820885,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.032339434681820885
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.01937943662891999,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.01937943662891999
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753088,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753088
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212093,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212093
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5751445086705202,
"acc_stderr": 0.026613350840261743,
"acc_norm": 0.5751445086705202,
"acc_norm_stderr": 0.026613350840261743
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.013488813404711903,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.013488813404711903
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.0272725828498398,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.0272725828498398
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39374185136897,
"acc_stderr": 0.012478532272564447,
"acc_norm": 0.39374185136897,
"acc_norm_stderr": 0.012478532272564447
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.03000856284500348,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.03000856284500348
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355586,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355586
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5379290576758808,
"mc2_stderr": 0.01514579551273296
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998285
},
"harness|gsm8k|5": {
"acc": 0.25701288855193327,
"acc_stderr": 0.012036781757428675
}
}
```
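The per-task MMLU (`hendrycksTest`) accuracies above can be macro-averaged as a rough cross-check of the aggregate score. The snippet below is only a sketch over a truncated, hypothetical subset of the results JSON, and the leaderboard's own aggregation may differ:

```python
import json

# Hypothetical two-task excerpt of the results JSON shown above
results_json = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.4222222222222222}
}
"""

results = json.loads(results_json)
# Keep only the MMLU subtasks and average their accuracies
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_avg, 4))  # 0.3811
```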
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
davanstrien/mapsnlsloaded | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
0: no building or railspace
1: railspace
2: building
3: railspace and non railspace building
- name: map_sheet
dtype: string
splits:
- name: test
num_bytes: 323743326.376
num_examples: 12404
- name: train
num_bytes: 957911247.448
num_examples: 37212
- name: validation
num_bytes: 316304202.708
num_examples: 12404
download_size: 1599110547
dataset_size: 1597958776.5319998
---
# Dataset Card for "mapsnlsloaded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amanneo/collected-mail-corpus-mini | ---
dataset_info:
features:
- name: id
dtype: float64
- name: email_type
dtype: string
- name: text
dtype: string
- name: mail_length
dtype: int64
splits:
- name: test
num_bytes: 4260.131707317073
num_examples: 21
- name: train
num_bytes: 37326.86829268293
num_examples: 184
download_size: 26719
dataset_size: 41587.0
---
# Dataset Card for "collected-mail-corpus-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bartolomeupaiva20/Eudes_carvalho_modelo_de-voz | ---
license: openrail
---
|
joey234/rotten_tomatoes_affix_neg | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: words_with_affixes
sequence: string
- name: sentence_replace_affix
dtype: string
splits:
- name: test
num_bytes: 32423
num_examples: 108
download_size: 25881
dataset_size: 32423
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "rotten_tomatoes_affix_neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shidowake/augmxnt_ultra-orca-boros-en-ja-v1_split_3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 20639999.933149945
num_examples: 9397
download_size: 10638579
dataset_size: 20639999.933149945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
UPNAdroneLab/powerline_towers | ---
license: cc-by-nc-sa-4.0
size_categories:
- n<1K
pretty_name: powerline_towers
---
# Power Line Towers Dataset
The dataset comprises 860 aerial images of power line towers captured by UAVs using RGB cameras. Intended specifically for object detection tasks, each tower in the dataset has been meticulously annotated with a bounding box in YOLO format, offering a valuable resource for training and evaluating computer vision models in the context of power line tower recognition.
## Dataset Details

* The RGB images are stored in a single folder.
* The annotations are stored in a single folder with one file per image, identified by the same file name as its image.
The annotations are provided in YOLO format: class, x_center, y_center, width, height.
All the values are expressed as a proportion of the image width and height, which are constant across all images.
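As a quick sanity check, a normalized YOLO annotation line can be converted back to pixel coordinates. The sketch below assumes a hypothetical 1000×800-pixel image; the class id and box values are illustrative only:

```python
def yolo_to_pixels(line, img_w, img_h):
    """Convert one YOLO line 'class x_center y_center width height' (normalized)
    to (class_id, x_min, y_min, x_max, y_max) in pixels."""
    cls, xc, yc, w, h = line.split()
    xc, yc = float(xc) * img_w, float(yc) * img_h
    w, h = float(w) * img_w, float(h) * img_h
    return int(cls), xc - w / 2, yc - h / 2, xc + w / 2, yc + h / 2

# Assumed 1000x800-pixel image with a centered tower bounding box
print(yolo_to_pixels("0 0.5 0.5 0.25 0.5", 1000, 800))  # (0, 375.0, 200.0, 625.0, 600.0)
```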
### Dataset Description
- **Curated by:** UPNAdrone: Drones Laboratory at Universidad Pública de Navarra
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** UPNAdrone: Drones Laboratory at Universidad Pública de Navarra
- **Language(s) (NLP):** N/A
- **License:** CC BY-NC-SA 4.0 (https://creativecommons.org/licenses/by-nc-sa/4.0/)
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
### Direct Use
Object detection of power line towers in aerial images for inspection tasks.
## Dataset Creation
### Curation Rationale
Research.
### Source Data
All the data has been obtained from our own inspection flights carried out for research purposes.
#### Data Collection and Processing
The data has been manually inspected, processed and annotated.
#### Annotation process
Manual annotation has been carried out for every single image using CVAT.
#### Personal and Sensitive Information
The authors state that there is no known personal nor sensitive information in the provided dataset.
## Bias, Risks, and Limitations
This dataset is intended for research purposes. Therefore, commercial use of the following dataset is not permitted.
### Recommendations
The authors explicitly disclaim any responsibility associated with the misuse of the dataset.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
WIP
**APA:**
WIP
## Dataset Card Contact
For support and/or questions, please get in touch directly with UPNAdrone: https://github.com/UPNAdrone
|
tyzhu/squad_baseline_v4_train_30_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 172536
num_examples: 159
- name: validation
num_bytes: 47457
num_examples: 50
download_size: 52942
dataset_size: 219993
---
# Dataset Card for "squad_baseline_v4_train_30_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bellagio-ai/t2i-hoan-kiem-lake | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 5768681.0
num_examples: 23
download_size: 5719900
dataset_size: 5768681.0
---
# Dataset Card for "t2i-hoan-kiem-lake"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Inv__MoeMoE-2x7b | ---
pretty_name: Evaluation run of Inv/MoeMoE-2x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Inv/MoeMoE-2x7b](https://huggingface.co/Inv/MoeMoE-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__MoeMoE-2x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T19:28:29.484742](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__MoeMoE-2x7b/blob/main/results_2024-03-11T19-28-29.484742.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6310191137887649,\n\
\ \"acc_stderr\": 0.03256565468316543,\n \"acc_norm\": 0.6313280552955073,\n\
\ \"acc_norm_stderr\": 0.03323284549215567,\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.01739458625074317,\n \"mc2\": 0.6165388259112794,\n\
\ \"mc2_stderr\": 0.015469241789129546\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168478,\n\
\ \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.01379618294778556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6715793666600279,\n\
\ \"acc_stderr\": 0.004686789042445369,\n \"acc_norm\": 0.8430591515634336,\n\
\ \"acc_norm_stderr\": 0.0036300159898963956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.037827289808654706,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.037827289808654706\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391943,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.02293514405391943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.014248873549217575,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.014248873549217575\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n\
\ \"acc_stderr\": 0.0158520024498621,\n \"acc_norm\": 0.3407821229050279,\n\
\ \"acc_norm_stderr\": 0.0158520024498621\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788513,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788513\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897219,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897219\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.02976826352893311,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.02976826352893311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.01739458625074317,\n \"mc2\": 0.6165388259112794,\n\
\ \"mc2_stderr\": 0.015469241789129546\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.01126851997157768\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6489764973464746,\n \
\ \"acc_stderr\": 0.013146945941397226\n }\n}\n```"
repo_url: https://huggingface.co/Inv/MoeMoE-2x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-28-29.484742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-28-29.484742.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- '**/details_harness|winogrande|5_2024-03-11T19-28-29.484742.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T19-28-29.484742.parquet'
- config_name: results
data_files:
- split: 2024_03_11T19_28_29.484742
path:
- results_2024-03-11T19-28-29.484742.parquet
- split: latest
path:
- results_2024-03-11T19-28-29.484742.parquet
---
# Dataset Card for Evaluation run of Inv/MoeMoE-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/MoeMoE-2x7b](https://huggingface.co/Inv/MoeMoE-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__MoeMoE-2x7b",
	"harness_winogrande_5",
	split="latest")  # splits: "latest" or a run timestamp (no "train" split exists)
```
## Latest results
These are the [latest results from run 2024-03-11T19:28:29.484742](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__MoeMoE-2x7b/blob/main/results_2024-03-11T19-28-29.484742.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6310191137887649,
"acc_stderr": 0.03256565468316543,
"acc_norm": 0.6313280552955073,
"acc_norm_stderr": 0.03323284549215567,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.01739458625074317,
"mc2": 0.6165388259112794,
"mc2_stderr": 0.015469241789129546
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168478,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.01379618294778556
},
"harness|hellaswag|10": {
"acc": 0.6715793666600279,
"acc_stderr": 0.004686789042445369,
"acc_norm": 0.8430591515634336,
"acc_norm_stderr": 0.0036300159898963956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.037827289808654706,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.037827289808654706
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.02293514405391943,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.02293514405391943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217575,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217575
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.0158520024498621,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.0158520024498621
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788513,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788513
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897219,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897219
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.02976826352893311,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.02976826352893311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.01739458625074317,
"mc2": 0.6165388259112794,
"mc2_stderr": 0.015469241789129546
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.01126851997157768
},
"harness|gsm8k|5": {
"acc": 0.6489764973464746,
"acc_stderr": 0.013146945941397226
}
}
```
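The `"all"` block at the top of the results aggregates the per-task scores. As a minimal sketch of how such an aggregate relates to the per-task values (an illustration, not the leaderboard's actual aggregation code), a macro-average over a few of the MMLU subtask accuracies listed above can be computed like this:

```python
# Per-task accuracies copied from the results above (a small subset for illustration).
per_task_acc = {
    "hendrycksTest-abstract_algebra": 0.29,
    "hendrycksTest-anatomy": 0.5925925925925926,
    "hendrycksTest-astronomy": 0.6842105263157895,
}

# Unweighted macro-average across tasks.
macro_avg = sum(per_task_acc.values()) / len(per_task_acc)
print(round(macro_avg, 4))  # → 0.5223
```

Averaging over all 57 MMLU subtasks (plus the other harness tasks, each with its own weighting) is what produces the `"all"` figures reported above.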
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
epts/joyokanji | ---
license: mit
---
|
GreeneryScenery/SheepsDiffusion | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: square_image
dtype: image
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 10700034070.0
num_examples: 10000
download_size: 10815458379
dataset_size: 10700034070.0
---
# Dataset Card for "SheepsDiffusion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.all_28 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30418826481.0
num_examples: 267179
download_size: 30176859052
dataset_size: 30418826481.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
tyzhu/random_letter_same_length_find_passage_train30_eval10_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 21610
num_examples: 70
- name: validation
num_bytes: 3230
num_examples: 10
download_size: 17427
dataset_size: 24840
---
# Dataset Card for "random_letter_same_length_find_passage_train30_eval10_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |