| datasetId | card |
|---|---|
AmanK1202/LogoGeneration_png | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 120298419.0
num_examples: 821
download_size: 120174466
dataset_size: 120298419.0
---
# Dataset Card for "LogoGeneration_png"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-6fbfec76-7855041 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: knkarthick/bart-large-xsum-samsum
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: knkarthick/bart-large-xsum-samsum
* Dataset: samsum
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
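The `col_mapping` block above maps the summarization task's generic roles onto the samsum columns (`text` → `dialogue`, `target` → `summary`). As a plain-Python illustration only (not the AutoTrain API), applying such a mapping might look like:

```python
# Illustrative only: apply a col_mapping ({generic_role: dataset_column})
# to re-key a dataset record into the generic schema the evaluator expects.
col_mapping = {"text": "dialogue", "target": "summary"}

def apply_col_mapping(record, mapping):
    """Return a new record keyed by generic role names."""
    return {role: record[column] for role, column in mapping.items()}

example = {"dialogue": "A: Hi!\nB: Hello.", "summary": "A greets B."}
mapped = apply_col_mapping(example, col_mapping)
# mapped == {"text": "A: Hi!\nB: Hello.", "target": "A greets B."}
```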
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
erwinqi/conslam_relabelled_semantic_reduced | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 401239316.0
num_examples: 88
- name: validation
num_bytes: 49226249.0
num_examples: 10
download_size: 450462129
dataset_size: 450465565.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
matvelen6369/eeg_data | ---
dataset_info:
features:
- name: eeg
sequence: float64
splits:
- name: train
num_bytes: 6066918000
num_examples: 49716
download_size: 4934969605
dataset_size: 6066918000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/squad_wrong_title_v4_train_30_eval_10_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 681763
num_examples: 368
- name: validation
num_bytes: 84048
num_examples: 50
download_size: 137622
dataset_size: 765811
---
# Dataset Card for "squad_wrong_title_v4_train_30_eval_10_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ThWu/dpo_highest_n_random | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 519912724
num_examples: 182470
download_size: 243211283
dataset_size: 519912724
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dpo_highest_n_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wangyi111/EuroSAT-SAR | ---
license: mit
task_categories:
- image-classification
---
## EuroSAT-SAR: Land Use and Land Cover Classification with Sentinel-1
The EuroSAT-SAR dataset is a SAR version of the popular [EuroSAT](https://github.com/phelber/EuroSAT) dataset. We matched each Sentinel-2 image in EuroSAT with one Sentinel-1 patch according to its geospatial coordinates, ending up with 27,000 dual-pol Sentinel-1 SAR images divided into 10 classes. The EuroSAT-SAR dataset was collected as one downstream task in the work [FG-MAE](https://github.com/zhu-xlab/FGMAE), serving as a CIFAR-like, clean, balanced, ML-ready dataset for remote sensing SAR image recognition.
<p align="center">
<img width="1000" alt="eurosat-sar sample images" src="assets/eurosat-sar.png">
</p>
The dataset can be downloaded as a compressed zip file [here](https://huggingface.co/datasets/wangyi111/EuroSAT-SAR/resolve/main/EuroSAT-SAR.zip).
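If the zip is extracted locally with one subfolder per class (an assumed layout; adjust the paths if the archive is organized differently), a quick per-class inventory can be sketched in Python:

```python
from pathlib import Path

def count_images_per_class(root):
    """Count image files per class, assuming one subfolder per class.

    NOTE: the class-folder layout is an assumption about how the
    EuroSAT-SAR zip extracts; adapt this if the structure differs.
    """
    counts = {}
    for class_dir in sorted(p for p in Path(root).iterdir() if p.is_dir()):
        n = sum(1 for f in class_dir.iterdir()
                if f.suffix.lower() in {".tif", ".png", ".jpg"})
        counts[class_dir.name] = n
    return counts
```

If the layout assumption holds, the counts should sum to 27,000 across the 10 classes.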
### Citation
```bibtex
@article{wang2023feature,
title={Feature Guided Masked Autoencoder for Self-supervised Learning in Remote Sensing},
author={Wang, Yi and Hern{\'a}ndez, Hugo Hern{\'a}ndez and Albrecht, Conrad M and Zhu, Xiao Xiang},
journal={arXiv preprint arXiv:2310.18653},
year={2023}
}
```
|
jiaqianjing/PatentData | ---
license: gpl-3.0
task_categories:
- text-generation
language:
- zh
tags:
- patent
---
## Data Source
**[China Patent Information Center](https://patdata2.cnipa.gov.cn/)**
## Field Descriptions
* patent_id: patent number
* patent_pub_date: patent publication date
* title: patent title
* applicant: applicant (person or organization)
* application_date: filing date
* inventors: inventors
* summary: abstract
* description: full text of the specification
* claim: full text of the patent claims
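As an illustration of the schema above (with invented placeholder values, not real data), a record can be checked against the field list in plain Python:

```python
# Field names taken from the card's field descriptions; the record
# contents here are placeholders, not actual patent data.
FIELDS = [
    "patent_id", "patent_pub_date", "title", "applicant",
    "application_date", "inventors", "summary", "description", "claim",
]

def missing_fields(record):
    """Return the schema fields absent from a record."""
    return [f for f in FIELDS if f not in record]

record = {f: "" for f in FIELDS}
# missing_fields(record) == []
```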
## Usage Restrictions
This dataset, and any derivatives produced with it, may be used for research purposes only; commercial use, and any other use that could harm society, is prohibited. The dataset does not represent the position, interests, or views of any party, and is unrelated to claims of any kind by any group. This project accepts no liability for any damage or dispute arising from the use of this dataset. |
autoevaluate/autoeval-eval-project-squad_v2-7b0e814c-1303349869 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: nbroad/rob-base-superqa2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: nbroad/rob-base-superqa2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
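The dotted names in `col_mapping` above (e.g. `answers.text`) address nested columns. A hedged plain-Python sketch of resolving such a path (not the AutoTrain implementation) might look like:

```python
def get_nested(record, dotted_path):
    """Resolve a dotted column path like 'answers.text' in a nested record."""
    value = record
    for key in dotted_path.split("."):
        value = value[key]
    return value

row = {"context": "...", "question": "Where?",
       "answers": {"text": ["Paris"], "answer_start": [17]}}
# get_nested(row, "answers.text") == ["Paris"]
```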
## Contributions
Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model. |
loremipsum3658/sick-br | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: pair_ID
dtype: int64
- name: sentence_A
dtype: string
- name: sentence_B
dtype: string
- name: entailment_label
dtype: string
- name: relatedness_score
dtype: float64
- name: entailment_AB
dtype: string
- name: entailment_BA
dtype: string
- name: sentence_A_original
dtype: string
- name: sentence_B_original
dtype: string
- name: sentence_A_dataset
dtype: string
- name: sentence_B_dataset
dtype: string
- name: SemEval_set
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2196243
num_examples: 6887
- name: test
num_bytes: 470001
num_examples: 1477
- name: validation
num_bytes: 470022
num_examples: 1476
download_size: 1217241
dataset_size: 3136266
---
# Dataset Card for "sick-br"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nateraw/imagenet-sketch | ---
license: mit
---
|
bigbio/mirna | ---
language:
- en
bigbio_language:
- English
license: cc-by-nc-3.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_NC_3p0
pretty_name: miRNA
homepage: https://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads/download-mirna-test-corpus.html
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- NAMED_ENTITY_DISAMBIGUATION
---
# Dataset Card for miRNA
## Dataset Description
- **Homepage:** https://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads/download-mirna-test-corpus.html
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER,NED
The corpus consists of 301 Medline citations. The documents were screened for
mentions of miRNA in the abstract text. Gene, disease, and miRNA entities were manually
annotated. The corpus comprises two separate files, a train set and a test set, drawn
from 201 and 100 documents respectively.
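Entity annotations in corpora like this are typically stored as character offsets into the abstract text. As a minimal sketch in plain Python (not the bigbio loader's actual schema), a sanity check that an entity's surface form matches its offsets:

```python
# Minimal illustration of offset-based entity annotations, as used for
# NER corpora like miRNA. Sketch only; not the bigbio loader's schema.
def entity_text_matches(doc_text, entity):
    """Check that an entity's recorded surface form matches its offsets."""
    start, end = entity["offsets"]
    return doc_text[start:end] == entity["text"]

doc = "miR-21 is upregulated in many cancers."
ent = {"offsets": [0, 6], "text": "miR-21", "type": "miRNA"}
# entity_text_matches(doc, ent) == True
```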
## Citation Information
```
@Article{Bagewadi2014,
author={Bagewadi, Shweta
and Bobi{\'{c}}, Tamara
and Hofmann-Apitius, Martin
and Fluck, Juliane
and Klinger, Roman},
title={Detecting miRNA Mentions and Relations in Biomedical Literature},
journal={F1000Research},
year={2014},
month={Aug},
day={28},
publisher={F1000Research},
volume={3},
pages={205-205},
keywords={MicroRNAs; corpus; prediction algorithms},
abstract={
INTRODUCTION: MicroRNAs (miRNAs) have demonstrated their potential as post-transcriptional
gene expression regulators, participating in a wide spectrum of regulatory events such as
apoptosis, differentiation, and stress response. Apart from the role of miRNAs in normal
physiology, their dysregulation is implicated in a vast array of diseases. Dissection of
miRNA-related associations are valuable for contemplating their mechanism in diseases,
leading to the discovery of novel miRNAs for disease prognosis, diagnosis, and therapy.
MOTIVATION: Apart from databases and prediction tools, miRNA-related information is largely
available as unstructured text. Manual retrieval of these associations can be labor-intensive
due to steadily growing number of publications. Additionally, most of the published miRNA
entity recognition methods are keyword based, further subjected to manual inspection for
retrieval of relations. Despite the fact that several databases host miRNA-associations
derived from text, lower sensitivity and lack of published details for miRNA entity
recognition and associated relations identification has motivated the need for developing
comprehensive methods that are freely available for the scientific community. Additionally,
the lack of a standard corpus for miRNA-relations has caused difficulty in evaluating the
available systems. We propose methods to automatically extract mentions of miRNAs, species,
genes/proteins, disease, and relations from scientific literature. Our generated corpora,
along with dictionaries, and miRNA regular expression are freely available for academic
purposes. To our knowledge, these resources are the most comprehensive developed so far.
RESULTS: The identification of specific miRNA mentions reaches a recall of 0.94 and
precision of 0.93. Extraction of miRNA-disease and miRNA-gene relations lead to an
F1 score of up to 0.76. A comparison of the information extracted by our approach to
the databases miR2Disease and miRSel for the extraction of Alzheimer's disease
related relations shows the capability of our proposed methods in identifying correct
relations with improved sensitivity. The published resources and described methods can
help the researchers for maximal retrieval of miRNA-relations and generation of
miRNA-regulatory networks. AVAILABILITY: The training and test corpora, annotation
guidelines, developed dictionaries, and supplementary files are available at
http://www.scai.fraunhofer.de/mirna-corpora.html.
},
note={26535109[pmid]},
note={PMC4602280[pmcid]},
issn={2046-1402},
url={https://pubmed.ncbi.nlm.nih.gov/26535109},
language={eng}
}
```
|
bertbsb/prtimvoz | ---
license: openrail
---
|
sabaaziz24/_ | ---
license: openrail
---
|
bdsaglam/musique-answerable-2hop-subset-erx-reward-2023-12-30T17-40-15 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: reward
dtype: int64
splits:
- name: train
num_bytes: 128205
num_examples: 90
download_size: 18415
dataset_size: 128205
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/fw_num_bi_train_100_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 36899
num_examples: 500
- name: train_doc2id
num_bytes: 15892
num_examples: 200
- name: train_id2doc
num_bytes: 16492
num_examples: 200
- name: train_find_word
num_bytes: 4515
num_examples: 100
- name: eval_find_word
num_bytes: 4623
num_examples: 100
download_size: 44830
dataset_size: 78421
---
# Dataset Card for "fw_num_bi_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BByrneLab/EVQA_PreFLMR_preprocessed_passages | ---
dataset_info:
features:
- name: language
dtype: string
- name: passage_id
dtype: string
- name: passage_content
dtype: string
splits:
- name: train_passages
num_bytes: 58570897
num_examples: 50205
- name: valid_passages
num_bytes: 59117345
num_examples: 50753
- name: test_passages
num_bytes: 60113716
num_examples: 51472
download_size: 106160568
dataset_size: 177801958
configs:
- config_name: default
data_files:
- split: train_passages
path: data/train_passages-*
- split: valid_passages
path: data/valid_passages-*
- split: test_passages
path: data/test_passages-*
---
|
yuvalkirstain/pexel_images_lots_with_generated_captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: generated_caption
dtype: string
splits:
- name: train
num_bytes: 2467169489.125
num_examples: 7999
download_size: 2418777187
dataset_size: 2467169489.125
---
# Dataset Card for "pexel_images_lots_with_generated_captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_HuggingFaceTB__cosmo-1b | ---
pretty_name: Evaluation run of HuggingFaceTB/cosmo-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HuggingFaceTB/cosmo-1b](https://huggingface.co/HuggingFaceTB/cosmo-1b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HuggingFaceTB__cosmo-1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T22:24:29.025319](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceTB__cosmo-1b/blob/main/results_2024-02-20T22-24-29.025319.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2698889533621004,\n\
\ \"acc_stderr\": 0.0314781880414406,\n \"acc_norm\": 0.2719343408831061,\n\
\ \"acc_norm_stderr\": 0.03221191811445044,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.38259377102490544,\n\
\ \"mc2_stderr\": 0.014283688892810937\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3302047781569966,\n \"acc_stderr\": 0.013743085603760427,\n\
\ \"acc_norm\": 0.3856655290102389,\n \"acc_norm_stderr\": 0.014224250973257174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4182433778131846,\n\
\ \"acc_stderr\": 0.0049226246369452435,\n \"acc_norm\": 0.5507866958773153,\n\
\ \"acc_norm_stderr\": 0.004963974504003025\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066654,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066654\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.038035102483515854,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.038035102483515854\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544088,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544088\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.03456425745087001,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.03456425745087001\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.030251237579213167,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.030251237579213167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.16551724137931034,\n \"acc_stderr\": 0.03097055996622408,\n\
\ \"acc_norm\": 0.16551724137931034,\n \"acc_norm_stderr\": 0.03097055996622408\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.035670166752768614,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.035670166752768614\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2161290322580645,\n\
\ \"acc_stderr\": 0.023415293433568532,\n \"acc_norm\": 0.2161290322580645,\n\
\ \"acc_norm_stderr\": 0.023415293433568532\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733545,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2512820512820513,\n \"acc_stderr\": 0.02199201666237054,\n \
\ \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.02199201666237054\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094528,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094528\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02934457250063435,\n \
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02934457250063435\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790205,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790205\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353603,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353603\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083289,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083289\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041017,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041017\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.18404907975460122,\n \"acc_stderr\": 0.030446777687971757,\n\
\ \"acc_norm\": 0.18404907975460122,\n \"acc_norm_stderr\": 0.030446777687971757\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952686,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952686\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.02891120880274946,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.02891120880274946\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.023176298203992023,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.023176298203992023\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574882,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574882\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427904,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.025025538500532338,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.025025538500532338\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967267,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537776,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537776\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2633637548891786,\n\
\ \"acc_stderr\": 0.0112495064036053,\n \"acc_norm\": 0.2633637548891786,\n\
\ \"acc_norm_stderr\": 0.0112495064036053\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.022770868010113014,\n\
\ \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.022770868010113014\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.01736247376214663,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.01736247376214663\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245231,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245231\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.38259377102490544,\n\
\ \"mc2_stderr\": 0.014283688892810937\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.55327545382794,\n \"acc_stderr\": 0.013972488371616692\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \
\ \"acc_stderr\": 0.0062163286402381465\n }\n}\n```"
repo_url: https://huggingface.co/HuggingFaceTB/cosmo-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|arc:challenge|25_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|arc:challenge|25_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|gsm8k|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|gsm8k|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hellaswag|10_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hellaswag|10_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-17-27.049029.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-24-29.025319.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T22-24-29.025319.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- '**/details_harness|winogrande|5_2024-02-20T22-17-27.049029.parquet'
- split: 2024_02_20T22_24_29.025319
path:
- '**/details_harness|winogrande|5_2024-02-20T22-24-29.025319.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T22-24-29.025319.parquet'
- config_name: results
data_files:
- split: 2024_02_20T22_17_27.049029
path:
- results_2024-02-20T22-17-27.049029.parquet
- split: 2024_02_20T22_24_29.025319
path:
- results_2024-02-20T22-24-29.025319.parquet
- split: latest
path:
- results_2024-02-20T22-24-29.025319.parquet
---
# Dataset Card for Evaluation run of HuggingFaceTB/cosmo-1b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HuggingFaceTB/cosmo-1b](https://huggingface.co/HuggingFaceTB/cosmo-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HuggingFaceTB__cosmo-1b",
"harness_winogrande_5",
	split="latest")
```
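For other tasks in this repo, the per-task config name can be assembled from the task name and shot count (a minimal sketch; `task_config` is a hypothetical helper for illustration, not part of the `datasets` API):

```python
def task_config(task: str, n_shot: int) -> str:
    """Build the per-task config name used by this repository,
    following the "harness_<task>_<n_shot>" pattern seen above."""
    return f"harness_{task}_{n_shot}"

name = task_config("hendrycksTest_world_religions", 5)
print(name)  # harness_hendrycksTest_world_religions_5

# The most recent run for that task could then be loaded with, e.g.:
#   from datasets import load_dataset
#   data = load_dataset("open-llm-leaderboard/details_HuggingFaceTB__cosmo-1b",
#                       name, split="latest")
```

`harness_winogrande_5` in the example above is one such name.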
## Latest results
These are the [latest results from run 2024-02-20T22:24:29.025319](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceTB__cosmo-1b/blob/main/results_2024-02-20T22-24-29.025319.json) (note that there may be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each in the timestamped and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.2698889533621004,
"acc_stderr": 0.0314781880414406,
"acc_norm": 0.2719343408831061,
"acc_norm_stderr": 0.03221191811445044,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.38259377102490544,
"mc2_stderr": 0.014283688892810937
},
"harness|arc:challenge|25": {
"acc": 0.3302047781569966,
"acc_stderr": 0.013743085603760427,
"acc_norm": 0.3856655290102389,
"acc_norm_stderr": 0.014224250973257174
},
"harness|hellaswag|10": {
"acc": 0.4182433778131846,
"acc_stderr": 0.0049226246369452435,
"acc_norm": 0.5507866958773153,
"acc_norm_stderr": 0.004963974504003025
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066654,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066654
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.038035102483515854,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.038035102483515854
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544088,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544088
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.03456425745087001,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.03456425745087001
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.030251237579213167,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.030251237579213167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.16551724137931034,
"acc_stderr": 0.03097055996622408,
"acc_norm": 0.16551724137931034,
"acc_norm_stderr": 0.03097055996622408
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.035670166752768614,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.035670166752768614
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.023415293433568532,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.023415293433568532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733545,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.02199201666237054,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.02199201666237054
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094528,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094528
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02934457250063435,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02934457250063435
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790205,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790205
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353603,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353603
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083289,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083289
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041017,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041017
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.18404907975460122,
"acc_stderr": 0.030446777687971757,
"acc_norm": 0.18404907975460122,
"acc_norm_stderr": 0.030446777687971757
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952686,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952686
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274946,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274946
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.023176298203992023,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.023176298203992023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574882,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574882
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427904,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.025025538500532338,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.025025538500532338
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967267,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2633637548891786,
"acc_stderr": 0.0112495064036053,
"acc_norm": 0.2633637548891786,
"acc_norm_stderr": 0.0112495064036053
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16911764705882354,
"acc_stderr": 0.022770868010113014,
"acc_norm": 0.16911764705882354,
"acc_norm_stderr": 0.022770868010113014
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.01736247376214663,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.01736247376214663
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245231,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245231
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.38259377102490544,
"mc2_stderr": 0.014283688892810937
},
"harness|winogrande|5": {
"acc": 0.55327545382794,
"acc_stderr": 0.013972488371616692
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.0062163286402381465
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yuvalkirstain/PickaPic-rankings-6-3-2023 | ---
dataset_info:
features:
- name: ranking_id
dtype: int64
- name: created_at
dtype: timestamp[ns]
- name: user_id
dtype: int64
- name: image_1_uid
dtype: string
- name: image_2_uid
dtype: string
- name: image_3_uid
dtype: 'null'
- name: image_4_uid
dtype: 'null'
- name: best_image_uid
dtype: string
- name: prompt
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 16312020
num_examples: 71457
download_size: 5911046
dataset_size: 16312020
---
# Dataset Card for "PickaPic-rankings-6-3-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kadialkad/snli-s | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: test
num_bytes: 125501
num_examples: 1000
- name: train
num_bytes: 5994668
num_examples: 50000
- name: validation
num_bytes: 128470
num_examples: 1000
download_size: 1870273
dataset_size: 6248639
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
strombergnlp/pypi-20211209 | ---
license: apache-2.0
---
|
liujqian/commonsenseqa_with_content_words | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: question_concept
dtype: string
- name: choices
sequence:
- name: label
dtype: string
- name: text
dtype: string
- name: answerKey
dtype: string
- name: question_content_words
sequence: string
- name: choice_0_content_words
sequence: string
- name: choice_1_content_words
sequence: string
- name: choice_2_content_words
sequence: string
- name: choice_3_content_words
sequence: string
- name: choice_4_content_words
sequence: string
splits:
- name: train
num_bytes: 3595329
num_examples: 9741
- name: validation
num_bytes: 446090
num_examples: 1221
- name: test
num_bytes: 419929
num_examples: 1140
download_size: 2361458
dataset_size: 4461348
---
# Dataset Card for "commonsenseqa_with_content_words"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2 | ---
pretty_name: Evaluation run of CalderaAI/13B-Thorns-l2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CalderaAI/13B-Thorns-l2](https://huggingface.co/CalderaAI/13B-Thorns-l2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T07:53:37.765793](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2/blob/main/results_2023-10-24T07-53-37.765793.json) (note\
\ that there may be results for other tasks in this repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.38873741610738255,\n\
\ \"em_stderr\": 0.004992082219869444,\n \"f1\": 0.4612814597315456,\n\
\ \"f1_stderr\": 0.004772539023607796,\n \"acc\": 0.3770824444865971,\n\
\ \"acc_stderr\": 0.007432066740076047\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.38873741610738255,\n \"em_stderr\": 0.004992082219869444,\n\
\ \"f1\": 0.4612814597315456,\n \"f1_stderr\": 0.004772539023607796\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.0026153265107756716\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CalderaAI/13B-Thorns-l2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|arc:challenge|25_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T07_53_37.765793
path:
- '**/details_harness|drop|3_2023-10-24T07-53-37.765793.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T07-53-37.765793.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T07_53_37.765793
path:
- '**/details_harness|gsm8k|5_2023-10-24T07-53-37.765793.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T07-53-37.765793.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hellaswag|10_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T17-37-55.153820.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T17-37-55.153820.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T07_53_37.765793
path:
- '**/details_harness|winogrande|5_2023-10-24T07-53-37.765793.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T07-53-37.765793.parquet'
- config_name: results
data_files:
- split: 2023_09_12T17_37_55.153820
path:
- results_2023-09-12T17-37-55.153820.parquet
- split: 2023_10_24T07_53_37.765793
path:
- results_2023-10-24T07-53-37.765793.parquet
- split: latest
path:
- results_2023-10-24T07-53-37.765793.parquet
---
# Dataset Card for Evaluation run of CalderaAI/13B-Thorns-l2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CalderaAI/13B-Thorns-l2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CalderaAI/13B-Thorns-l2](https://huggingface.co/CalderaAI/13B-Thorns-l2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T07:53:37.765793](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Thorns-l2/blob/main/results_2023-10-24T07-53-37.765793.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.38873741610738255,
"em_stderr": 0.004992082219869444,
"f1": 0.4612814597315456,
"f1_stderr": 0.004772539023607796,
"acc": 0.3770824444865971,
"acc_stderr": 0.007432066740076047
},
"harness|drop|3": {
"em": 0.38873741610738255,
"em_stderr": 0.004992082219869444,
"f1": 0.4612814597315456,
"f1_stderr": 0.004772539023607796
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.0026153265107756716
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
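For quick comparisons across runs, the nested structure above can be flattened into a single metric-per-key mapping. A minimal sketch, operating directly on the per-task metrics shown in the JSON above (not on a live download, so no network access is assumed):

```python
# Per-task metrics copied from the results JSON above.
results = {
    "harness|drop|3": {"em": 0.38873741610738255, "f1": 0.4612814597315456},
    "harness|gsm8k|5": {"acc": 0.009097801364670205},
    "harness|winogrande|5": {"acc": 0.745067087608524},
}

# Flatten to "task/metric" keys so runs can be diffed key-by-key.
flat = {
    f"{task}/{metric}": value
    for task, metrics in results.items()
    for metric, value in metrics.items()
}

for name, value in sorted(flat.items()):
    print(f"{name}: {value:.4f}")
```

The same flattening works on the dict loaded from the "results" configuration, since each timestamped split carries the same nested task-to-metrics layout.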
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mtek2000/hausa_topic_classification | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_mnli_null_relcl | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 463647
num_examples: 1796
- name: dev_mismatched
num_bytes: 549705
num_examples: 2124
- name: test_matched
num_bytes: 497422
num_examples: 1941
- name: test_mismatched
num_bytes: 532226
num_examples: 2111
- name: train
num_bytes: 19712790
num_examples: 76988
download_size: 13857134
dataset_size: 21755790
---
# Dataset Card for "MULTI_VALUE_mnli_null_relcl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sam2ai/hindi_truthfulqa_gen_mini | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
dataset_info:
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
splits:
- name: validation
num_bytes: 81430
num_examples: 50
download_size: 35995
dataset_size: 81430
---
# Dataset Card for "hindi_truthfulqa_gen_mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SyedAbdul__test-7B-slerp | ---
pretty_name: Evaluation run of SyedAbdul/test-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SyedAbdul/test-7B-slerp](https://huggingface.co/SyedAbdul/test-7B-slerp) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SyedAbdul__test-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T13:37:15.686780](https://huggingface.co/datasets/open-llm-leaderboard/details_SyedAbdul__test-7B-slerp/blob/main/results_2024-01-04T13-37-15.686780.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6489692801245827,\n\
\ \"acc_stderr\": 0.03208856045898211,\n \"acc_norm\": 0.649945556226307,\n\
\ \"acc_norm_stderr\": 0.03273989306811606,\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6259520494051883,\n\
\ \"mc2_stderr\": 0.014977076792645322\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.01398303690409409,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173311\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6744672376020713,\n\
\ \"acc_stderr\": 0.004676159299105416,\n \"acc_norm\": 0.8607847042421828,\n\
\ \"acc_norm_stderr\": 0.003454635760066236\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066496,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n\
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n\
\ \"acc_stderr\": 0.012698825252435106,\n \"acc_norm\": 0.4471968709256845,\n\
\ \"acc_norm_stderr\": 0.012698825252435106\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306032,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306032\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6259520494051883,\n\
\ \"mc2_stderr\": 0.014977076792645322\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6542835481425322,\n \
\ \"acc_stderr\": 0.01310042299044157\n }\n}\n```"
repo_url: https://huggingface.co/SyedAbdul/test-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-37-15.686780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-37-15.686780.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- '**/details_harness|winogrande|5_2024-01-04T13-37-15.686780.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T13-37-15.686780.parquet'
- config_name: results
data_files:
- split: 2024_01_04T13_37_15.686780
path:
- results_2024-01-04T13-37-15.686780.parquet
- split: latest
path:
- results_2024-01-04T13-37-15.686780.parquet
---
# Dataset Card for Evaluation run of SyedAbdul/test-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SyedAbdul/test-7B-slerp](https://huggingface.co/SyedAbdul/test-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
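As the config listings above show, the split names use underscores where the parquet filenames use hyphens in the timestamp. A minimal sketch of that mapping (the helper name is ours, not part of the dataset):

```python
def split_name_from_file_timestamp(ts: str) -> str:
    """Map a results-file timestamp such as '2024-01-04T13-37-15.686780'
    to the corresponding split name, '2024_01_04T13_37_15.686780'."""
    # Every hyphen in the date and time parts becomes an underscore;
    # the 'T' separator and the fractional seconds are unchanged.
    return ts.replace("-", "_")

print(split_name_from_file_timestamp("2024-01-04T13-37-15.686780"))
# → 2024_01_04T13_37_15.686780
```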
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SyedAbdul__test-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-04T13:37:15.686780](https://huggingface.co/datasets/open-llm-leaderboard/details_SyedAbdul__test-7B-slerp/blob/main/results_2024-01-04T13-37-15.686780.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6489692801245827,
"acc_stderr": 0.03208856045898211,
"acc_norm": 0.649945556226307,
"acc_norm_stderr": 0.03273989306811606,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6259520494051883,
"mc2_stderr": 0.014977076792645322
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.01398303690409409,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173311
},
"harness|hellaswag|10": {
"acc": 0.6744672376020713,
"acc_stderr": 0.004676159299105416,
"acc_norm": 0.8607847042421828,
"acc_norm_stderr": 0.003454635760066236
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066496,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638193,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4471968709256845,
"acc_stderr": 0.012698825252435106,
"acc_norm": 0.4471968709256845,
"acc_norm_stderr": 0.012698825252435106
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306032,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306032
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6259520494051883,
"mc2_stderr": 0.014977076792645322
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.6542835481425322,
"acc_stderr": 0.01310042299044157
}
}
```
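Once parsed (e.g. with `json.load` on the results file linked above), individual metrics can be read out of this nested dictionary by task key. A minimal sketch, using values copied from the JSON above:

```python
# A small excerpt of the results dictionary shown above.
results = {
    "harness|winogrande|5": {"acc": 0.8082083662194159,
                             "acc_stderr": 0.011065209664659527},
    "harness|gsm8k|5": {"acc": 0.6542835481425322,
                        "acc_stderr": 0.01310042299044157},
}

# Report each task's accuracy with its standard error.
for task, metrics in results.items():
    print(f"{task}: acc = {metrics['acc']:.4f} ± {metrics['acc_stderr']:.4f}")
```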
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ssbuild/alpaca_dolly | ---
license: apache-2.0
---
|
mlabonne/medical-mqca-fr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: Specialite
dtype: string
- name: Serie
dtype: int64
- name: Question
dtype: int64
- name: N_Question
dtype: int64
- name: Answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4455800
num_examples: 3836
- name: eval
num_bytes: 172116
num_examples: 150
download_size: 2123478
dataset_size: 4627916
---
# Dataset Card for "medical-mqca-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_automerger__Experiment28Yam-7B | ---
pretty_name: Evaluation run of automerger/Experiment28Yam-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [automerger/Experiment28Yam-7B](https://huggingface.co/automerger/Experiment28Yam-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__Experiment28Yam-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T23:59:57.983132](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__Experiment28Yam-7B/blob/main/results_2024-04-05T23-59-57.983132.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501524355257138,\n\
\ \"acc_stderr\": 0.032105621193819094,\n \"acc_norm\": 0.6493107160751739,\n\
\ \"acc_norm_stderr\": 0.0327807693064778,\n \"mc1\": 0.6266829865361077,\n\
\ \"mc1_stderr\": 0.016932370557570638,\n \"mc2\": 0.782582180310156,\n\
\ \"mc2_stderr\": 0.013594678008386197\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n\
\ \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n\
\ \"acc_stderr\": 0.004516215206715357,\n \"acc_norm\": 0.8911571400119498,\n\
\ \"acc_norm_stderr\": 0.003108054563352108\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922438,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922438\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6266829865361077,\n\
\ \"mc1_stderr\": 0.016932370557570638,\n \"mc2\": 0.782582180310156,\n\
\ \"mc2_stderr\": 0.013594678008386197\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.012696930106562912\n }\n}\n```"
repo_url: https://huggingface.co/automerger/Experiment28Yam-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|arc:challenge|25_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|gsm8k|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hellaswag|10_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T23-59-57.983132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T23-59-57.983132.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- '**/details_harness|winogrande|5_2024-04-05T23-59-57.983132.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T23-59-57.983132.parquet'
- config_name: results
data_files:
- split: 2024_04_05T23_59_57.983132
path:
- results_2024-04-05T23-59-57.983132.parquet
- split: latest
path:
- results_2024-04-05T23-59-57.983132.parquet
---
# Dataset Card for Evaluation run of automerger/Experiment28Yam-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [automerger/Experiment28Yam-7B](https://huggingface.co/automerger/Experiment28Yam-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_automerger__Experiment28Yam-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-05T23:59:57.983132](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__Experiment28Yam-7B/blob/main/results_2024-04-05T23-59-57.983132.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6501524355257138,
"acc_stderr": 0.032105621193819094,
"acc_norm": 0.6493107160751739,
"acc_norm_stderr": 0.0327807693064778,
"mc1": 0.6266829865361077,
"mc1_stderr": 0.016932370557570638,
"mc2": 0.782582180310156,
"mc2_stderr": 0.013594678008386197
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266129,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989506
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715357,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.003108054563352108
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922438,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922438
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6266829865361077,
"mc1_stderr": 0.016932370557570638,
"mc2": 0.782582180310156,
"mc2_stderr": 0.013594678008386197
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571764
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.012696930106562912
}
}
```
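As an illustrative sketch (not part of the evaluation pipeline — the dictionary below copies only a few entries from the JSON above), the aggregated MMLU score can be recomputed by averaging the per-task `hendrycksTest` accuracies while skipping the other benchmarks:

```python
# Illustrative only: a few entries copied from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|winogrande|5": {"acc": 0.8484609313338595},
}

# Average only the MMLU ("hendrycksTest") tasks, skipping other benchmarks.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_average = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_average, 4))
```

With the full results file loaded (e.g. via `json.load`), the same comprehension averages all 57 MMLU subtasks.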
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anechaev/med_history | ---
license: mit
---
|
ucla-contextual/contextual_test | ---
configs:
- config_name: default
data_files:
- split: test
path: "contextual_test.csv"
license: mit
---
Check out the [paper](https://arxiv.org/abs/2401.13311). |
GARDA/customsmkcode | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "customsmkcode"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c6b5396f | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1338
dataset_size: 184
---
# Dataset Card for "c6b5396f"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DylanJHJ/pds2023 | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-futin__random-en-30c46b-2023566786 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/random
eval_info:
task: text_zero_shot_classification
model: facebook/opt-13b
metrics: []
dataset_name: futin/random
dataset_config: en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-13b
* Dataset: futin/random
* Config: en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
CyberHarem/kitazawa_shiho_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kitazawa_shiho/北沢志保/키타자와시호 (THE iDOLM@STER: Million Live!)
This is the dataset of kitazawa_shiho/北沢志保/키타자와시호 (THE iDOLM@STER: Million Live!), containing 500 images and their tags.
The core tags of this character are `long_hair, brown_hair, brown_eyes, breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 579.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitazawa_shiho_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 348.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitazawa_shiho_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1170 | 727.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitazawa_shiho_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 520.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitazawa_shiho_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1170 | 1014.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitazawa_shiho_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kitazawa_shiho_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
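Once loaded, the per-image metadata can be aggregated to see which tags dominate the dataset. Below is a minimal sketch; the `count_tags` helper is our own name, and it assumes `item.meta['tags']` (as printed in the snippet above) is a mapping from tag name to score:

```python
from collections import Counter

def count_tags(tag_dicts):
    """Aggregate tag frequencies across a list of per-image tag mappings."""
    counter = Counter()
    for tags in tag_dicts:
        # Each entry is assumed to map tag name -> score; only names are counted.
        counter.update(tags.keys())
    return counter

# With the waifuc source from the snippet above, you would collect:
#   tag_dicts = [item.meta['tags'] for item in source]
# Here we use a tiny hand-written sample instead.
sample = [
    {'1girl': 0.99, 'solo': 0.95},
    {'1girl': 0.98, 'smile': 0.90},
]
print(count_tags(sample).most_common(1))  # the most frequent tag and its count
```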
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, cat_ears, cat_tail, jingle_bell, looking_at_viewer, solo, long_sleeves, paw_gloves, black_dress, blush, cleavage, fur_trim, neck_bell, purple_bow, cat_paws, simple_background, white_background, open_mouth, shiny_hair, blue_bowtie, closed_mouth, frilled_dress, paw_shoes, ribbon, yellow_eyes |
| 1 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, closed_mouth, collarbone, short_sleeves, simple_background, solo, striped_shirt, upper_body, blush, white_background |
| 2 | 9 |  |  |  |  |  | 1girl, smile, solo, blush, looking_at_viewer |
| 3 | 5 |  |  |  |  |  | 1girl, cleavage, collarbone, looking_at_viewer, medium_breasts, solo, blush, floral_print, lingerie, closed_mouth, shiny_hair, smile, yellow_eyes, babydoll, bare_shoulders, detached_sleeves, on_side, panties, underwear_only |
| 4 | 20 |  |  |  |  |  | 1girl, blush, solo, navel, looking_at_viewer, nipples, medium_breasts, female_pubic_hair, collarbone, closed_mouth, completely_nude, simple_background, stomach, upper_body |
| 5 | 8 |  |  |  |  |  | blush, day, looking_at_viewer, outdoors, 1girl, medium_breasts, ocean, solo, cleavage, collarbone, closed_mouth, cloud, navel, side-tie_bikini_bottom, blue_sky, cowboy_shot, lens_flare, standing, water, wet, yellow_eyes, bare_shoulders, beach, black_bikini, frills, front-tie_bikini_top, halterneck, hand_up, smile, wading, white_bikini |
| 6 | 6 |  |  |  |  |  | 1girl, black_serafuku, looking_at_viewer, solo, black_shirt, black_skirt, red_neckerchief, black_gloves, black_sailor_collar, pleated_skirt, short_sleeves, closed_mouth, fingerless_gloves, standing |
| 7 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, dress, holding_microphone, blurry, bow, open_mouth, frilled_sleeves, hair_ribbon, juliet_sleeves, upper_body, wrist_cuffs, yellow_eyes |
| 8 | 30 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, nipples, penis, sex, sweat, vaginal, medium_breasts, open_mouth, navel, looking_at_viewer, pussy, completely_nude, female_pubic_hair, mosaic_censoring, straddling |
| 9 | 9 |  |  |  |  |  | 1girl, solo, closed_mouth, sleeveless, black_shorts, hat, looking_at_viewer, shiny_hair, short_shorts, blue_headwear, detached_sleeves, floating_hair, standing, dress, frills, striped, very_long_hair, black_sleeves, bow, cowboy_shot, long_sleeves, smile, thighhighs, yellow_eyes |
| 10 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, maid_apron, maid_headdress, solo, frills, wa_maid, long_sleeves, wide_sleeves, medical_eyepatch, outdoors, black_kimono, holding_weapon, knife, night, parted_lips, sky, upper_body, white_apron |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cat_ears | cat_tail | jingle_bell | looking_at_viewer | solo | long_sleeves | paw_gloves | black_dress | blush | cleavage | fur_trim | neck_bell | purple_bow | cat_paws | simple_background | white_background | open_mouth | shiny_hair | blue_bowtie | closed_mouth | frilled_dress | paw_shoes | ribbon | yellow_eyes | collarbone | short_sleeves | striped_shirt | upper_body | smile | medium_breasts | floral_print | lingerie | babydoll | bare_shoulders | detached_sleeves | on_side | panties | underwear_only | navel | nipples | female_pubic_hair | completely_nude | stomach | day | outdoors | ocean | cloud | side-tie_bikini_bottom | blue_sky | cowboy_shot | lens_flare | standing | water | wet | beach | black_bikini | frills | front-tie_bikini_top | halterneck | hand_up | wading | white_bikini | black_serafuku | black_shirt | black_skirt | red_neckerchief | black_gloves | black_sailor_collar | pleated_skirt | fingerless_gloves | dress | holding_microphone | blurry | bow | frilled_sleeves | hair_ribbon | juliet_sleeves | wrist_cuffs | 1boy | hetero | solo_focus | penis | sex | sweat | vaginal | pussy | mosaic_censoring | straddling | sleeveless | black_shorts | hat | short_shorts | blue_headwear | floating_hair | striped | very_long_hair | black_sleeves | thighhighs | maid_apron | maid_headdress | wa_maid | wide_sleeves | medical_eyepatch | black_kimono | holding_weapon | knife | night | parted_lips | sky | white_apron |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------|:-----------|:--------------|:--------------------|:-------|:---------------|:-------------|:--------------|:--------|:-----------|:-----------|:------------|:-------------|:-----------|:--------------------|:-------------------|:-------------|:-------------|:--------------|:---------------|:----------------|:------------|:---------|:--------------|:-------------|:----------------|:----------------|:-------------|:--------|:-----------------|:---------------|:-----------|:-----------|:-----------------|:-------------------|:----------|:----------|:-----------------|:--------|:----------|:--------------------|:------------------|:----------|:------|:-----------|:--------|:--------|:-------------------------|:-----------|:--------------|:-------------|:-----------|:--------|:------|:--------|:---------------|:---------|:-----------------------|:-------------|:----------|:---------|:---------------|:-----------------|:--------------|:--------------|:------------------|:---------------|:----------------------|:----------------|:--------------------|:--------|:---------------------|:---------|:------|:------------------|:--------------|:-----------------|:--------------|:-------|:---------|:-------------|:--------|:------|:--------|:----------|:--------|:-------------------|:-------------|:-------------|:---------------|:------|:---------------|:----------------|:----------------|:----------|:-----------------|:----------------|:-------------|:-------------|:-----------------|:----------|:---------------|:-------------------|:---------------|:-----------------|:--------|:--------|:--------------|:------|:--------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | | | X | X | | | | X | | | | | | X | X | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | X | X | | | | X | X | | | | | | | | X | | X | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 20 |  |  |  |  |  | X | | | | X | X | | | | X | | | | | | X | | | | | X | | | | | X | | | X | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | X | X | | | | X | X | | | | | | | | | | X | | | | X | X | | | | X | X | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | X | X | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | X | X | | | | X | | | | | | | | X | | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 30 |  |  |  |  |  | X | | | | X | | | | | X | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | | | | X | X | X | | | | | | | | | | | | X | | X | | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 10 | 8 |  |  |  |  |  | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
czczycz/QABot | ---
license: openrail
---
|
RissoleDekejo/Bubsy | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_qqp_analytic_superlative | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 866229
num_examples: 5699
- name: test
num_bytes: 8686312
num_examples: 55717
- name: train
num_bytes: 7958668
num_examples: 52303
download_size: 9064404
dataset_size: 17511209
---
# Dataset Card for "MULTI_VALUE_qqp_analytic_superlative"
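A minimal loading sketch based on the metadata above. The `load_dataset` call needs the `datasets` library and network access, so it is left commented out; the split sizes used in the sanity check are taken directly from the `dataset_info` block:

```python
# from datasets import load_dataset
# dev = load_dataset("liuyanchen1015/MULTI_VALUE_qqp_analytic_superlative", split="dev")

# Back-of-the-envelope check from the split metadata above:
# (num_bytes, num_examples) per split, giving average serialized example size.
splits = {
    "dev": (866229, 5699),
    "test": (8686312, 55717),
    "train": (7958668, 52303),
}

for name, (num_bytes, num_examples) in splits.items():
    print(f"{name}: ~{num_bytes / num_examples:.0f} bytes/example")
```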
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Expert68__llama2_13b_instructed_version2 | ---
pretty_name: Evaluation run of Expert68/llama2_13b_instructed_version2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Expert68/llama2_13b_instructed_version2](https://huggingface.co/Expert68/llama2_13b_instructed_version2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Expert68__llama2_13b_instructed_version2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-12T19:44:39.658427](https://huggingface.co/datasets/open-llm-leaderboard/details_Expert68__llama2_13b_instructed_version2_public/blob/main/results_2023-11-12T19-44-39.658427.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5535385938067054,\n\
\ \"acc_stderr\": 0.03382379046360409,\n \"acc_norm\": 0.5616374813808622,\n\
\ \"acc_norm_stderr\": 0.034597480068222046,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.01625524199317918,\n \"mc2\": 0.46118545589659976,\n\
\ \"mc2_stderr\": 0.015483508114692393,\n \"em\": 0.007340604026845637,\n\
\ \"em_stderr\": 0.0008741896875345934,\n \"f1\": 0.07567323825503336,\n\
\ \"f1_stderr\": 0.0016747744191590948\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256519,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946705\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6412069308902609,\n\
\ \"acc_stderr\": 0.004786660691181909,\n \"acc_norm\": 0.8404700258912567,\n\
\ \"acc_norm_stderr\": 0.003654212329516619\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.019069098363191428,\n \"\
acc_norm\": 0.728440366972477,\n \"acc_norm_stderr\": 0.019069098363191428\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335442,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335442\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.01519047371703751,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.01519047371703751\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.01639222189940707,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.01639222189940707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.01267190278256765,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.01267190278256765\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.03030625772246831,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.03030625772246831\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.01625524199317918,\n \"mc2\": 0.46118545589659976,\n\
\ \"mc2_stderr\": 0.015483508114692393\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908194\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.007340604026845637,\n \
\ \"em_stderr\": 0.0008741896875345934,\n \"f1\": 0.07567323825503336,\n\
\ \"f1_stderr\": 0.0016747744191590948\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.10993176648976498,\n \"acc_stderr\": 0.008616195587865397\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Expert68/llama2_13b_instructed_version2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|arc:challenge|25_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|drop|3_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|gsm8k|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hellaswag|10_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-44-39.658427.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-12T19-44-39.658427.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- '**/details_harness|winogrande|5_2023-11-12T19-44-39.658427.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-12T19-44-39.658427.parquet'
- config_name: results
data_files:
- split: 2023_11_12T19_44_39.658427
path:
- results_2023-11-12T19-44-39.658427.parquet
- split: latest
path:
- results_2023-11-12T19-44-39.658427.parquet
---
# Dataset Card for Evaluation run of Expert68/llama2_13b_instructed_version2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Expert68/llama2_13b_instructed_version2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Expert68/llama2_13b_instructed_version2](https://huggingface.co/Expert68/llama2_13b_instructed_version2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Expert68__llama2_13b_instructed_version2_public",
"harness_winogrande_5",
	split="latest")
```
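Besides `"latest"`, each configuration also exposes one split per run, named after the run timestamp with its separators normalized to underscores. A minimal sketch of that naming convention (the helper name `timestamp_to_split` is my own, not part of this card or the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp, either in ISO form ('2023-11-12T19:44:39.658427')
    or in the file-name form ('2023-11-12T19-44-39.658427'), to the
    corresponding split name ('2023_11_12T19_44_39.658427')."""
    return ts.replace("-", "_").replace(":", "_")

# Both timestamp spellings resolve to the same split name:
print(timestamp_to_split("2023-11-12T19:44:39.658427"))
# 2023_11_12T19_44_39.658427
```

Passing that split name instead of `"latest"` in the `load_dataset` call above loads the results of that specific run.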
## Latest results
These are the [latest results from run 2023-11-12T19:44:39.658427](https://huggingface.co/datasets/open-llm-leaderboard/details_Expert68__llama2_13b_instructed_version2_public/blob/main/results_2023-11-12T19-44-39.658427.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5535385938067054,
"acc_stderr": 0.03382379046360409,
"acc_norm": 0.5616374813808622,
"acc_norm_stderr": 0.034597480068222046,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317918,
"mc2": 0.46118545589659976,
"mc2_stderr": 0.015483508114692393,
"em": 0.007340604026845637,
"em_stderr": 0.0008741896875345934,
"f1": 0.07567323825503336,
"f1_stderr": 0.0016747744191590948
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256519,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946705
},
"harness|hellaswag|10": {
"acc": 0.6412069308902609,
"acc_stderr": 0.004786660691181909,
"acc_norm": 0.8404700258912567,
"acc_norm_stderr": 0.003654212329516619
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091707,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.019069098363191428,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.019069098363191428
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653064,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653064
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335442,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335442
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.01519047371703751,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.01519047371703751
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940707,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.01267190278256765,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.01267190278256765
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.03030625772246831,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.03030625772246831
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317918,
"mc2": 0.46118545589659976,
"mc2_stderr": 0.015483508114692393
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908194
},
"harness|drop|3": {
"em": 0.007340604026845637,
"em_stderr": 0.0008741896875345934,
"f1": 0.07567323825503336,
"f1_stderr": 0.0016747744191590948
},
"harness|gsm8k|5": {
"acc": 0.10993176648976498,
"acc_stderr": 0.008616195587865397
}
}
```
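As a hedged illustration (the repository id and full results layout are not shown in this excerpt), aggregating the per-task scores above once the results JSON has been parsed into a Python dict might look like:

```python
# Toy excerpt of the results dict shown above (two MMLU subtasks only).
results = {
    "harness|hendrycksTest-college_physics|5": {"acc_norm": 0.24509803921568626},
    "harness|hendrycksTest-computer_security|5": {"acc_norm": 0.62},
}

# Mean acc_norm over the hendrycksTest (MMLU) subtasks.
scores = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
print(round(sum(scores) / len(scores), 4))  # -> 0.4325
```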
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
OdiaGenAI/roleplay_english | ---
task_categories:
- question-answering
- conversational
language:
- en
size_categories:
- 1K<n<10K
--- |
nicholasbien/lakh-txt-full-v2-gpt2-tokenized | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1554119120
num_examples: 13560
- name: test
num_bytes: 385501195
num_examples: 3390
download_size: 662698287
dataset_size: 1939620315
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
sugeun/summaryTest | ---
license: apache-2.0
---
|
cvzion/dqg-dataset-v4-final | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 71406
num_examples: 131
download_size: 28648
dataset_size: 71406
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BubbleJoe/mscoco_simplified | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: restval
path: data/restval-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: sentids
dtype: int64
- name: sentences
dtype: string
- name: simplified
dtype: string
splits:
- name: train
num_bytes: 37370437
num_examples: 414113
- name: test
num_bytes: 2252431
num_examples: 25010
- name: restval
num_bytes: 13747474
num_examples: 152634
- name: validation
num_bytes: 2254719
num_examples: 25010
download_size: 29875182
dataset_size: 55625061
---
# Dataset Card for "mscoco_simplified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akorson/allenai | ---
license: openrail
---
|
CyberHarem/nyotengu_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nyotengu/女天狗/女天狗 (Azur Lane)
This is the dataset of nyotengu/女天狗/女天狗 (Azur Lane), containing 191 images and their tags.
The core tags of this character are `black_hair, breasts, large_breasts, long_hair, mole, purple_eyes, mole_under_mouth, bangs, blunt_bangs, hime_cut, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 191 | 231.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyotengu_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 191 | 143.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyotengu_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 434 | 281.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyotengu_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 191 | 211.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyotengu_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 434 | 380.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyotengu_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nyotengu_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, solo, detached_sleeves, looking_at_viewer, cleavage, bare_shoulders, black_wings, tokin_hat, hand_fan, feathered_wings, makeup, smile, kimono, tengu, sash, hauchiwa, tongue_out, lips |
| 1 | 22 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, bare_shoulders, navel, parted_lips, smile, o-ring_bikini, water, fingerless_gloves, black_bikini, collarbone, wet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | detached_sleeves | looking_at_viewer | cleavage | bare_shoulders | black_wings | tokin_hat | hand_fan | feathered_wings | makeup | smile | kimono | tengu | sash | hauchiwa | tongue_out | lips | navel | parted_lips | o-ring_bikini | water | fingerless_gloves | black_bikini | collarbone | wet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------------|:--------------------|:-----------|:-----------------|:--------------|:------------|:-----------|:------------------|:---------|:--------|:---------|:--------|:-------|:-----------|:-------------|:-------|:--------|:--------------|:----------------|:--------|:--------------------|:---------------|:-------------|:------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 22 |  |  |  |  |  | X | X | | X | X | X | | | | | | X | | | | | | | X | X | X | X | X | X | X | X |
|
FarAwayFer/gua-llama-ofan | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1656684
num_examples: 1008
download_size: 970097
dataset_size: 1656684
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gua-llama-ofan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YuehHanChen/sst2_finetuning_dataset | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 11574670
num_examples: 68221
download_size: 0
dataset_size: 11574670
---
# Dataset Card for "sst2_finetuning_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
slayone/uva_spoj_raw | ---
license: mit
task_categories:
- translation
- text-generation
tags:
- code
size_categories:
- 1K<n<10K
--- |
udayl/rocks | ---
license: mit
---
Rocks dataset with 7 classes: [Coal, Limestone, Marble, Sandstone, Quartzite, Basalt, Granite]
|
DONG19/instruct_code_search_net | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1974467839.5874639
num_examples: 1629456
- name: validation
num_bytes: 93239061.8591426
num_examples: 76505
- name: test
num_bytes: 104426710.35366909
num_examples: 87036
download_size: 652473629
dataset_size: 2172133611.8002753
---
# Dataset Card for "instruct_code_search_net"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/akatsukinoyona | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Akatsuki No Yona
This is the image base of the bangumi Akatsuki no Yona. We detected 41 characters and 3412 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 532 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 33 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 76 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 69 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 39 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 34 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 18 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 213 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 46 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 207 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 29 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 58 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 50 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 60 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 35 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 58 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 28 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 15 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 15 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 230 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 57 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 22 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 85 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 31 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 21 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 25 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 9 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 21 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 797 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 77 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 11 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 7 | [Download](31/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 32 | 14 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 26 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 41 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 14 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 6 | [Download](36/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 37 | 14 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 9 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 46 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 234 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Brizape/tmvar_split_0404_dev | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: texts
dtype: string
splits:
- name: train
num_bytes: 1614127.7155688624
num_examples: 801
- name: validation
num_bytes: 405043.2844311377
num_examples: 201
- name: test
num_bytes: 977708
num_examples: 498
download_size: 883485
dataset_size: 2996879.0
---
# Dataset Card for "tmvar_split_0404_dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/VALUE_cola_dey_it | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1545
num_examples: 22
- name: test
num_bytes: 1597
num_examples: 21
- name: train
num_bytes: 10190
num_examples: 146
download_size: 12452
dataset_size: 13332
---
# Dataset Card for "VALUE_cola_dey_it"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathanasdf/MathGLM-dataset-500k | ---
license: afl-3.0
---
Every 100th row from https://github.com/THUDM/MathGLM (original dataset has 50M entries) |
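A minimal sketch of this kind of every-100th-row subsampling (the toy list below stands in for the original 50M-entry dump; the actual file names and format in the MathGLM repo may differ):

```python
# Toy stand-in for the original 50M-row MathGLM dataset.
rows = [f"expr_{i}" for i in range(1000)]

# Keep every 100th row, as described above.
subsample = rows[::100]
print(len(subsample))  # -> 10
```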
zolak/twitter_dataset_81_1713184381 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 228697
num_examples: 552
download_size: 122639
dataset_size: 228697
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maximedb/natural_questions | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 10087609
num_examples: 130233
- name: validation
num_bytes: 714323
num_examples: 8643
download_size: 6827128
dataset_size: 10801932
---
# Dataset Card for "natural_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
madha98/Shakespeare | ---
license: mit
---
|
sheik21/voz-ronaldo | ---
license: openrail
---
|
RamonPereira/minhavoz98 | ---
license: openrail
---
|
Ramitha/open-australian-legal-qa-test-analysis | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: original_texts
sequence: int64
- name: is_knowledge_available
dtype: string
- name: llm_knowledge_document
dtype: string
splits:
- name: test
num_bytes: 55142
num_examples: 35
download_size: 41463
dataset_size: 55142
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_mrpc_your_yalls | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 895
num_examples: 3
- name: train
num_bytes: 1605
num_examples: 6
- name: validation
num_bytes: 225
num_examples: 1
download_size: 12395
dataset_size: 2725
---
# Dataset Card for "MULTI_VALUE_mrpc_your_yalls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CamiloVega/Llama2-jobsedcription-requirement | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3964110
num_examples: 500
download_size: 1691632
dataset_size: 3964110
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
m720/SHADR | ---
license: cc-by-4.0
task_categories:
- text-classification
language:
- en
tags:
- medical
pretty_name: SHADR
size_categories:
- 1K<n<10K
---
# SDoH Human Annotated Demographic Robustness (SHADR) Dataset
## Overview
Social determinants of health (SDoH) play a pivotal role in determining patient outcomes, yet their documentation in electronic health records (EHRs) remains incomplete. This dataset was created for a study examining the capability of large language models to extract SDoH from the free-text sections of EHRs. The study also examined the potential of synthetic clinical text to bolster the extraction of these scarcely documented, yet crucial, clinical data.
## Dataset Structure & Modification
To understand potential biases in high-performing models and in those pre-trained on general text, GPT-4 was utilized to infuse demographic descriptors into our synthetic data.
For instance:
- **Original Sentence**: "Widower admits fears surrounding potential judgment…"
- **Modified Sentence**: “Hispanic widower admits fears surrounding potential judgment..."
Such demographic-infused sentences underwent manual validation. Out of these:
- 419 had mentions of SDoH
- 253 had mentions of adverse SDoH
- The remainder were tagged as NO_SDoH
## Instructions for Model Evaluation
1. Initially, run your model inference on the original sentences.
2. Subsequently, apply the same model to infer on the demographic-modified sentences.
3. Perform comparisons for robustness.
For a detailed understanding of the "adverse" labeling, refer to https://arxiv.org/pdf/2308.06354.pdf. The 'adverse' column indicates whether the label corresponds to an "adverse" or "non-adverse" SDoH.
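As an illustration of step 3, here is a minimal sketch, assuming the robustness rate is the fraction of sentence pairs whose predicted label flips after the demographic descriptor is inserted (the label values below are illustrative, not the exact label set):

```python
def robustness_rate(original_preds, modified_preds):
    """Fraction of sentence pairs whose predicted label changes
    between the original and demographic-modified sentences."""
    assert len(original_preds) == len(modified_preds)
    flips = sum(o != m for o, m in zip(original_preds, modified_preds))
    return flips / len(original_preds)

# Toy example: 1 of 4 predictions flips -> 25% robustness rate.
print(robustness_rate(["SDOH", "NO_SDoH", "SDOH", "SDOH"],
                      ["SDOH", "NO_SDoH", "NO_SDoH", "SDOH"]))  # -> 0.25
```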
## Current Performance Metrics
- **Best Model Performance**:
- **Any SDoH**: 88% Macro-F1
- **Adverse SDoH**: 84% Macro-F1
- **Robustness Rate**:
- **Any SDoH**: 9.9%
- **Adverse SDoH**: 14.3%
## External Links
- A PhysioNet release of our annotated MIMIC-III corpus: https://physionet.org/content/annotation-dataset-sdoh/1.0.0/
- GitHub release: https://github.com/AIM-Harvard/SDoH
---
How to Cite:
```
@misc{guevara2023large,
title={Large Language Models to Identify Social Determinants of Health in Electronic Health Records},
author={Marco Guevara and Shan Chen and Spencer Thomas and Tafadzwa L. Chaunzwa and Idalid Franco and Benjamin Kann and Shalini Moningi and Jack Qian and Madeleine Goldstein and Susan Harper and Hugo JWL Aerts and Guergana K. Savova and Raymond H. Mak and Danielle S. Bitterman},
year={2023},
eprint={2308.06354},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
vikp/code_with_explanations | ---
license: cc-by-4.0
dataset_info:
features:
- name: text
dtype: string
- name: kind
dtype: string
splits:
- name: train
num_bytes: 18084855022
num_examples: 959307
download_size: 6281752624
dataset_size: 18084855022
---
|
asas-ai/wiki_completion | ---
dataset_info:
features:
- name: id
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2971489519
num_examples: 1225880
download_size: 1314634499
dataset_size: 2971489519
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
practical-dreamer/RPGPT_PublicDomain-alpaca | ---
license: mit
task_categories:
- conversational
language:
- en
tags:
- alpaca
pretty_name: rpgpt-alpaca
size_categories:
- 10M<n<100M
---
Experimental Synthetic Dataset of Public Domain Character Dialogue in Roleplay Format
Generated using scripts from my https://github.com/practicaldreamer/build-a-dataset repo
|
Dippi9845/arxiv-fragments-generated | ---
license: cc-by-nc-sa-4.0
---
|
metro1/databricks-custom-dataset | ---
license: cc-by-sa-4.0
---
|
tomaarsen/setfit-absa-semeval-laptops | ---
dataset_info:
features:
- name: text
dtype: string
- name: span
dtype: string
- name: label
dtype: string
- name: ordinal
dtype: int64
splits:
- name: train
num_bytes: 335243
num_examples: 2358
- name: test
num_bytes: 76698
num_examples: 654
download_size: 146971
dataset_size: 411941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "tomaarsen/setfit-absa-semeval-laptops"
### Dataset Summary
This dataset contains the manually annotated laptop reviews from SemEval-2014 Task 4, in the format understood by [SetFit](https://github.com/huggingface/setfit) ABSA.
For more details, see https://aclanthology.org/S14-2004/
### Data Instances
An example of "train" looks as follows.
```json
{"text": "I charge it at night and skip taking the cord with me because of the good battery life.", "span": "cord", "label": "neutral", "ordinal": 0}
{"text": "I charge it at night and skip taking the cord with me because of the good battery life.", "span": "battery life", "label": "positive", "ordinal": 0}
{"text": "The tech guy then said the service center does not do 1-to-1 exchange and I have to direct my concern to the \"sales\" team, which is the retail shop which I bought my netbook from.", "span": "service center", "label": "negative", "ordinal": 0}
{"text": "The tech guy then said the service center does not do 1-to-1 exchange and I have to direct my concern to the \"sales\" team, which is the retail shop which I bought my netbook from.", "span": "\"sales\" team", "label": "negative", "ordinal": 0}
{"text": "The tech guy then said the service center does not do 1-to-1 exchange and I have to direct my concern to the \"sales\" team, which is the retail shop which I bought my netbook from.", "span": "tech guy", "label": "neutral", "ordinal": 0}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
- `span`: a `string` feature showing the aspect span from the text.
- `label`: a `string` feature showing the polarity of the aspect span.
- `ordinal`: an `int64` feature showing the n-th occurrence of the span in the text. This is useful for if the span occurs within the same text multiple times.
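For illustration only (this helper is not part of the SetFit API), a minimal sketch of resolving a (`span`, `ordinal`) pair to a character offset within `text`:

```python
def span_start(text, span, ordinal):
    """Character offset of the (ordinal + 1)-th occurrence of `span` in `text`."""
    start = -1
    for _ in range(ordinal + 1):
        start = text.find(span, start + 1)
        if start == -1:
            raise ValueError(f"{span!r} occurs fewer than {ordinal + 1} times")
    return start

text = "good battery life; battery life matters"
print(span_start(text, "battery life", 0))  # -> 5
print(span_start(text, "battery life", 1))  # -> 19
```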
### Data Splits
| name |train|test|
|---------|----:|---:|
|tomaarsen/setfit-absa-semeval-laptops|2358|654|
### Training ABSA models using SetFit ABSA
To train using this dataset, first install the SetFit library:
```bash
pip install setfit
```
And then you can use the following script as a guideline for how to train an ABSA model on this dataset:
```python
from setfit import AbsaModel, AbsaTrainer, TrainingArguments
from datasets import load_dataset
from transformers import EarlyStoppingCallback
# You can initialize an AbsaModel using one or two SentenceTransformer models, or two ABSA models
model = AbsaModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
# The training/eval dataset must have `text`, `span`, `label`, and `ordinal` columns
dataset = load_dataset("tomaarsen/setfit-absa-semeval-laptops")
train_dataset = dataset["train"]
eval_dataset = dataset["test"]
args = TrainingArguments(
output_dir="models",
use_amp=True,
batch_size=256,
eval_steps=50,
save_steps=50,
load_best_model_at_end=True,
)
trainer = AbsaTrainer(
model,
args=args,
train_dataset=train_dataset,
eval_dataset=eval_dataset,
callbacks=[EarlyStoppingCallback(early_stopping_patience=5)],
)
trainer.train()
metrics = trainer.evaluate(eval_dataset)
print(metrics)
trainer.push_to_hub("tomaarsen/setfit-absa-laptops")
```
You can then run inference like so:
```python
from setfit import AbsaModel
# Download from Hub and run inference
model = AbsaModel.from_pretrained(
"tomaarsen/setfit-absa-laptops-aspect",
"tomaarsen/setfit-absa-laptops-polarity",
)
# Run inference
preds = model([
"Boots up fast and runs great!",
"The screen shows great colors.",
])
```
### Citation Information
```bibtex
@inproceedings{pontiki-etal-2014-semeval,
title = "{S}em{E}val-2014 Task 4: Aspect Based Sentiment Analysis",
author = "Pontiki, Maria and
Galanis, Dimitris and
Pavlopoulos, John and
Papageorgiou, Harris and
Androutsopoulos, Ion and
Manandhar, Suresh",
editor = "Nakov, Preslav and
Zesch, Torsten",
booktitle = "Proceedings of the 8th International Workshop on Semantic Evaluation ({S}em{E}val 2014)",
month = aug,
year = "2014",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/S14-2004",
doi = "10.3115/v1/S14-2004",
pages = "27--35",
}
``` |
mcemilg/news-cat | ---
task_categories:
- text-classification
language:
- tr
---
Homepage: http://www.kemik.yildiz.edu.tr/veri_kumelerimiz.html |
mmeberg/PyVulDet-NER | ---
task_categories:
- token-classification
language:
- en
tags:
- code
---
The data in this dataset repository is associated with the following NER models, which identify six vulnerability types in Python source code:
https://huggingface.co/mmeberg/RoRo_PyVulDet_NER
https://huggingface.co/mmeberg/RoCo_PyVulDet_NER
https://huggingface.co/mmeberg/DiDi_PyVulDet_NER
https://huggingface.co/mmeberg/CoRo_PyVulDet_NER
https://huggingface.co/mmeberg/CoCo_PyVulDet_NER
In addition, a manuscript detailing this work has been submitted to the DevSecOps: Advances for Secure Software Development special issue of Computers & Security.
This research is part of an in-progress dissertation at George Washington University. |
pbaoo2705/covidqa_processed | ---
dataset_info:
features:
- name: context
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: train
num_bytes: 6915408
num_examples: 1960
download_size: 1791787
dataset_size: 6915408
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "covidqa_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/openhermes-dev_combined__1708357359 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1_policy
dtype: string
- name: candidate2
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate2_policy
dtype: string
splits:
- name: train
num_bytes: 1062480
num_examples: 200
download_size: 616725
dataset_size: 1062480
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
STAM/agricore | ---
license: mit
---
|
joey234/mmlu-professional_law-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 2471871
num_examples: 1534
download_size: 1367759
dataset_size: 2471871
---
# Dataset Card for "mmlu-professional_law-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Instander/instander-apk | ---
license: openrail
---
|
tyzhu/random_letter_find_passage_train10_eval10_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2514
num_examples: 30
- name: validation
num_bytes: 1120
num_examples: 10
download_size: 5250
dataset_size: 3634
---
# Dataset Card for "random_letter_find_passage_train10_eval10_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmavkgo/whisper_medium_ptt | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 202659352
num_examples: 211
- name: test
num_bytes: 25932248
num_examples: 27
- name: valid
num_bytes: 24972360
num_examples: 26
download_size: 35633385
dataset_size: 253563960
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
macrocosm/arxiv_titles | ---
license: mit
language:
- en
size_categories:
- 1M<n<10M
---
Titles of all 2.3 million papers on arXiv, embedded with the InstructorXL model.
No claims are made about the copyright or license of the contained materials. We assume no responsibility for, and are not liable under any circumstances for, any damages. Use at your own risk.
Good luck, have fun. |
Parikshith/snli_translated_en_fr | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 61069688
num_examples: 549367
- name: validation
num_bytes: 1102423
num_examples: 9842
- name: test
num_bytes: 1097712
num_examples: 9824
download_size: 20266943
dataset_size: 63269823
---
# Dataset Card for "snli_translated_en_fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
charris/wav2vec2_processed_spotify | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: input_length
dtype: float64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1705098062.7692308
num_examples: 936
- name: test
num_bytes: 365184849.9604743
num_examples: 192
- name: dev
num_bytes: 357809300.62992126
num_examples: 197
download_size: 2244849910
dataset_size: 2428092213.3596263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
---
|
marcones/sertanejo | ---
license: openrail
---
|
CyberHarem/ultimate_madoka_mahoushoujomadokamagica | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ultimate_madoka (Mahou Shoujo Madoka☆Magica)
This is the dataset of ultimate_madoka (Mahou Shoujo Madoka☆Magica), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
Falah/story44kids_0_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3026
num_examples: 13
download_size: 3674
dataset_size: 3026
---
# Dataset Card for "story44kids_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
llm-jp/oasst1-21k-ja | ---
license: apache-2.0
language:
- ja
size_categories:
- 10K<n<100K
---
# oasst1-21k-ja
This repository provides an instruction tuning dataset developed by [LLM-jp](https://llm-jp.nii.ac.jp/), a collaborative project launched in Japan.
This dataset is a Japanese translation of an English subset of [oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1) using DeepL.
The English subset is available [here](https://huggingface.co/datasets/llm-jp/oasst1-21k-en).
## Send Questions to
llm-jp(at)nii.ac.jp
## Model Card Authors
*The names are listed in alphabetical order.*
Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto. |
joaovitor2763/autotrain-data-llama-call-sdr | ---
language:
- pt
task_categories:
- summarization
---
# AutoTrain Dataset for project: llama-call-sdr
## Dataset Description
This dataset has been automatically processed by AutoTrain for project llama-call-sdr.
### Languages
The BCP-47 code for the dataset's language is pt.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Sim. Al\u00f4, Luiz? Oi. Fala, Luiz. Aqui quem fala \u00e9 o Santiago, aqui do G4, a gente conversou essa semana, tudo bem? Opa, amigo. Tudo certo. E voc\u00ea? Claro. Tudo \u00f3timo por aqui tamb\u00e9m, Luiz. Voc\u00ea est\u00e1 podendo falar agora comigo, n\u00e9? Sim. \u00d3timo. Luiz, como \u00e9 bem comentado na nossa liga\u00e7\u00e3o de ontem, ontem-ontem, na verdade, o intuito aqui, na verdade, \u00e9 entender um pouquinho do seu momento na empresa, vi que hoje voc\u00ea \u00e9 o s\u00f3cio fundador da A View Professional, n\u00e9, e a\u00ed eu vi que voc\u00ea est\u00e1 buscando para uma imers\u00e3o de vendas, n\u00e9, atualmente eu vi que, n\u00e3o sei se a sua empresa, ela atua no mercado de, no mercado de, \u00e9 de xampu, no caso, n\u00e3o sei se \u00e9, de peste? Sim, sim, \u00e9 xampu, tratamentos, aliciamentos, informol, tudo que tem a ver com o p\u00fablico, com cabeleireiros. Legal. Ent\u00e3o voc\u00eas atuam nesse setor atualmente. Bacana. E hoje eu gostaria, assim, de entender melhor um pouco da situa\u00e7\u00e3o que voc\u00ea se encontra, n\u00e9, at\u00e9 para ver se o G4 pode agregar o valor, n\u00e9, aos desafios que voc\u00ea enfrenta a\u00ed na sua empresa. Voc\u00ea consegue me dar um overview r\u00e1pido, at\u00e9 para a gente iniciar essa conversa, Luiz? Sim. Sim, n\u00f3s temos, n\u00f3s temos algumas opera\u00e7\u00f5es na regi\u00e3o, temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. 
Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Que a\u00ed, no Brasil, n\u00f3s temos a opera\u00e7\u00e3o do Brasil, t\u00e1? Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham. Aham.\n",
"target": "- Nome do cliente: Luiz\n- Cargo do cliente: S\u00f3cio fundador\n- Nome da empresa: A View Professional\n- Setor de atua\u00e7\u00e3o da empresa: O mercado de produtos para cabeleireiros (xampu, tratamentos, aliciamentos, informol e outros). N\u00e3o h\u00e1 detalhes espec\u00edficos al\u00e9m disso.\n- Quanto a empresa fatura anualmente: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Quantos funcion\u00e1rios a empresa possu\u00ed: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Como conheceu o G4: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- A pessoa tem algum amigo, conhecido ou s\u00f3cio que j\u00e1 foi aluno ou cliente do G4: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- O que motivou a pessoa a buscar o G4 no atual momento: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- J\u00e1 consumiu algum produto do G4: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Qual a principal dor apresentada pelo lead: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Quais os principais desafios apresentados pelo lead: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Foi informado ao lead que a imers\u00e3o \u00e9 presencial em S\u00e3o Paulo: N\u00e3o\n- Algum outro coment\u00e1rio \u00fatil para o time comercial: A entrevista carece de informa\u00e7\u00e3o substancial devido \u00e0 repeti\u00e7\u00e3o excessiva de uma declara\u00e7\u00e3o (\"n\u00f3s temos a opera\u00e7\u00e3o do Brasil\"). Isso deve ser verificado e corrigido para uma compreens\u00e3o mais clara da situa\u00e7\u00e3o do cliente."
},
{
"text": "Eu vi que voc\u00ea t\u00e1 com interesse, demonstrando interesse em uma das nossas imers\u00f5es presenciais, t\u00e1 certa? Isso. Perfeito. Eu vi aqui que voc\u00ea hoje \u00e9 CEO, presidente, \u00e9 da sua pr\u00f3pria empresa? Isso. Qual que \u00e9 o nome dessa empresa, Sidney? Sid Rodas. Rodas de carro. Sid Rodas. Sid Rodas, perfeito. Hoje voc\u00ea realmente vende rodas para carros ou abrange mais alguma coisa? Rodas e pneus. Rodas e pneus, perfeito. O que eu tenho pra te perguntar, hoje, como \u00e9 que voc\u00ea chegou at\u00e9 a gente G4? Eu sei que voc\u00ea t\u00e1 namorando a gente aqui faz um tempinho, gostaria de entender um pouco como voc\u00ea chegou at\u00e9 a gente, n\u00e9? O que voc\u00ea estava procurando na internet, j\u00e1 conhece os fundadores, como \u00e9 que \u00e9? Uma empresa de pneu que eu represento \u00e9 a Delint, GP Imports. Ele \u00e9 amigo do dono da G4, eles estudam em Harvard, juntos, faz curso l\u00e1. Ah, que legal. E ele que te apresentou o G4, \u00e9 isso? Ele que te indicou? Isso. Perfeito. Hoje o seu servi\u00e7o \u00e9 apenas rodas e pneus. A sua posi\u00e7\u00e3o hoje como presidente, ou voc\u00ea t\u00e1 num lugar mais estrat\u00e9gico, ou voc\u00ea ainda est\u00e1 dentro da produ\u00e7\u00e3o? Como \u00e9 que \u00e9 a sua fun\u00e7\u00e3o? Na produ\u00e7\u00e3o a milh\u00e3o. Na produ\u00e7\u00e3o a milh\u00e3o, sei como \u00e9 que \u00e9. \u00c9 horr\u00edvel, mas t\u00e1 errado, mas n\u00e3o tem o que fazer. Imagino. Quando voc\u00ea procurou nossas imers\u00f5es, o que que \u00e9? Realmente voc\u00ea quer tentar sair da produ\u00e7\u00e3o, quer conseguir isso, quer tra\u00e7ar um plano pra isso? Ou \u00e9 outra coisa, \u00e9 referente \u00e0s vendas? O que que t\u00e1 te tirando o sono a\u00ed que voc\u00ea t\u00e1 procurando a imers\u00e3o? Ah, na verdade eu queria me encontrar, n\u00e9? Na verdade a gente cresceu um pouquinho e eu me encontro perdido no meio da caminhada. 
Ent\u00e3o, por exemplo, o que eu t\u00f4 fazendo hoje t\u00e1 errado. Eu t\u00f4 perdendo neg\u00f3cios por n\u00e3o saber delegar, sabe? Entendi. Ent\u00e3o voc\u00ea t\u00e1 realmente procurando, voc\u00ea t\u00e1 procurando ver de fora seu neg\u00f3cio, entender, tra\u00e7ar um plano estrat\u00e9gico pra voc\u00ea conseguir suprir essa demanda que veio, porque voc\u00ea cresceu, acredito que muito r\u00e1pido, correto? Sim. Entendi. Ent\u00e3o hoje seu principal desafio \u00e9 realmente gerir pessoas, ter essa gest\u00e3o pra voc\u00ea conseguir fazer seu papel da melhor forma a\u00ed como CEO, correto? Isso, isso. E voc\u00ea \u00e9 aqui de S\u00e3o Paulo mesmo? Vi que seu DDD \u00e9 11. Isso, capital. Perfeito. Ent\u00e3o pra voc\u00ea n\u00e3o \u00e9 um problema que a imers\u00e3o seja presencial aqui na capital, n\u00e9? N\u00e3o, n\u00e3o. Perfeito. Bom, eu acredito que voc\u00ea tem muito perfil pra t\u00e1 conhecendo um pouco mais sobre as imers\u00f5es. A gente tem umas muito legais que v\u00e3o te ajudar especialmente nesse quesito, que voc\u00ea pode ficar tranquilo, \u00e9 uma coisa que acaba sendo muito comum hoje, n\u00e9? Que realmente a gente cresce do nada, a gente t\u00e1 ali acostumada na produ\u00e7\u00e3o e \u00e9 dif\u00edcil largar. Temos realmente mentores muito qualificados pra te ajudar a tra\u00e7ar esse plano de a\u00e7\u00e3o. Essa parte aqui que voc\u00ea fala comigo \u00e9 realmente uma parte de relacionamento pra eu entender o seu perfil e te tra\u00e7ar pro especialista que vai te explicar um pouco melhor sobre cada imers\u00e3o, entender o seu posicionamento atual e te direcionar pra melhor delas. Aqui a gente vende a verdade mesmo, Sidney. Ent\u00e3o, a gente n\u00e3o t\u00e1 aqui pra te empurrar a imers\u00e3o, a gente t\u00e1 aqui pra entender o que vai funcionar ou n\u00e3o pra voc\u00ea e juntos, n\u00e9, saber que essa solu\u00e7\u00e3o, voc\u00ea realmente acreditar nela pra conseguir aplicar. Ah, legal. 
Gostaria de saber se eu posso estar te direcionando ainda hoje pra um desses especialistas. Como \u00e9 que t\u00e1 sua agenda? N\u00e3o, pode estar de boa. T\u00f4 na estrada, t\u00f4 em viagem aqui, se ligar eu consigo falar. \u00c9 que esse especialista, o que acontece, ele estuda sobre o seu mercado, ele tem pelo menos 30 minutinhos a\u00ed antes de falar com voc\u00ea e a gente faz essa conversa pelo Google Meet. \u00c9 uma conversa r\u00e1pida, consider\u00e1vel, porque \u00e9 de 20 a 30 minutos, mas que realmente ele precisa, ele j\u00e1 vai te trazer alguns insights, direcionamentos. Ent\u00e3o, a gente precisava realmente da sua aten\u00e7\u00e3o mais em foco sobre isso. Ah, t\u00e1 aqui at\u00e9 amanh\u00e3, ent\u00e3o. Hoje n\u00e3o d\u00e1, hoje a gente consegue se encaixar at\u00e9 a noite, viu? Eu t\u00f4 fora de S\u00e3o Paulo, eu vim fazer uma visita no cliente fora da capital, eu t\u00f4 em Campinas. Ah, o senhor t\u00e1 em Campinas, entendi. Sem problemas. Amanh\u00e3, como \u00e9 que t\u00e1 seu dia? Voc\u00ea prefere pela manh\u00e3, \u00e0 tarde, \u00e0 noite voc\u00ea consegue? Porque quando a gente marcar aqui na agenda, eu realmente ocupo um espa\u00e7o dele e eu n\u00e3o quero nem perder o seu tempo nem o dele, entende? Ah, umas 5 horas por a\u00ed da tarde pra mim j\u00e1 \u00e9 legal. Perfeito. Vamos fazer assim, ent\u00e3o. Amanh\u00e3 eu te ligo ali pelas 20 pra 5 pra gente confirmar que voc\u00ea vai entrar na call e eu j\u00e1 vou colocar na agenda dele. Perfeito, Sidney? Combinado, ent\u00e3o. Combinado. Muito obrigada pela sua aten\u00e7\u00e3o, viu? \u00d3timo dia. Tchau, tchau.\n",
"target": "- Nome do cliente: Sidney\n- Cargo do cliente: CEO/Presidente\n- Nome da empresa: Sid Rodas\n- Setor de atua\u00e7\u00e3o da empresa: Venda de rodas e pneus para carros.\n- Quanto a empresa fatura anualmente: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Quantos funcion\u00e1rios a empresa possu\u00ed: Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Como conheceu o G4: Por meio de um amigo que \u00e9 dono da GP Imports e estuda em Harvard com um dos fundadores da G4.\n- A pessoa tem algum amigo, conhecido ou s\u00f3cio que j\u00e1 foi aluno ou cliente do G4? Sim, um amigo que estuda em Harvard com um dos fundadores do G4.\n- O que motivou a pessoa a buscar o G4 no atual momento? Sidney percebeu o crescimento de sua empresa e se sente perdido, precisando melhorar sua capacidade de delegar e gerir a empresa.\n- J\u00e1 consumiu algum produto do G4? Informa\u00e7\u00e3o n\u00e3o abordada na call\n- Principal dor apresentada pelo lead: Sidney percebeu que a empresa est\u00e1 crescendo e ele est\u00e1 se sentindo perdido, n\u00e3o conseguindo delegar e gerir de maneira eficiente.\n- Principais desafios apresentados pelo lead: Lidar com o crescimento da empresa, melhorar a gest\u00e3o de pessoas e aprender a delegar.\n- Foi informado ao lead que a imers\u00e3o \u00e9 presencial em S\u00e3o Paulo? Sim\n- Algum outro coment\u00e1rio \u00fatil para o time comercial: Sidney demonstrou interesse em participar de uma imers\u00e3o e parece bastante aberto a receber a ajuda oferecida pelo G4. Ele est\u00e1 com o desafio de escalonar sua empresa e parece motivado a resolver este problema. Al\u00e9m disso, ele tem disponibilidade para falar com um especialista \u00e0s 17h do dia seguinte."
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
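When consuming records with this two-field spec, a quick sanity check can catch malformed rows early. A minimal sketch (the helper name and sample values are ours, not part of the dataset):

```python
# Minimal sketch: validate that a record matches the schema declared above
# ("text" and "target", both strings). EXPECTED_FIELDS and matches_schema
# are illustrative names, not part of the dataset itself.
EXPECTED_FIELDS = {"text": str, "target": str}

def matches_schema(record: dict) -> bool:
    """Return True if the record has exactly the declared string fields."""
    return set(record) == set(EXPECTED_FIELDS) and all(
        isinstance(record[name], dtype) for name, dtype in EXPECTED_FIELDS.items()
    )
```

For example, `matches_schema({"text": "...", "target": "..."})` is `True`, while a record missing `target` or carrying a non-string value is rejected.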
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 170 |
| valid | 170 |
|
agangal/baseball-12 | ---
dataset_info:
features:
- name: image
dtype: image
- name: additional_feature
dtype: string
splits:
- name: train
num_bytes: 3790371.0
num_examples: 12
download_size: 3792311
dataset_size: 3790371.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
omarelsayeed/good_chats_dataset_pre_tokenization | ---
dataset_info:
features:
- name: Chat_ID
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 97515949
num_examples: 33735
download_size: 30316154
dataset_size: 97515949
---
# Dataset Card for "good_chats_dataset_pre_tokenization"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v3 | ---
pretty_name: Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2xOpenOrca-13B-IA3-v3](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T11:48:46.198205](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v3/blob/main/results_2023-10-22T11-48-46.198205.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n\
\ \"em_stderr\": 0.0006781451620479675,\n \"f1\": 0.07597000838926182,\n\
\ \"f1_stderr\": 0.001647112822339397,\n \"acc\": 0.45089736370800626,\n\
\ \"acc_stderr\": 0.010370579775637361\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479675,\n\
\ \"f1\": 0.07597000838926182,\n \"f1_stderr\": 0.001647112822339397\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \
\ \"acc_stderr\": 0.009065050306776916\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497808\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T11_48_46.198205
path:
- '**/details_harness|drop|3_2023-10-22T11-48-46.198205.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T11-48-46.198205.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T11_48_46.198205
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-48-46.198205.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-48-46.198205.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T11_48_46.198205
path:
- '**/details_harness|winogrande|5_2023-10-22T11-48-46.198205.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T11-48-46.198205.parquet'
- config_name: results
data_files:
- split: 2023_10_22T11_48_46.198205
path:
- results_2023-10-22T11-48-46.198205.parquet
- split: latest
path:
- results_2023-10-22T11-48-46.198205.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-IA3-v3](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T11:48:46.198205](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3-v3/blob/main/results_2023-10-22T11-48-46.198205.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479675,
"f1": 0.07597000838926182,
"f1_stderr": 0.001647112822339397,
"acc": 0.45089736370800626,
"acc_stderr": 0.010370579775637361
},
"harness|drop|3": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479675,
"f1": 0.07597000838926182,
"f1_stderr": 0.001647112822339397
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.009065050306776916
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497808
}
}
```
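The metric keys above follow the harness's `harness|<task>|<n_shot>` naming scheme. As a small illustration of that layout (the parsing helper is ours; the numbers are copied from the JSON above), per-task values can be pulled out like this:

```python
# Sketch: extract per-task metrics from the "harness|<task>|<n_shot>" keys.
# The values below are copied from the latest-results JSON shown above;
# metric_by_task is an illustrative helper, not part of the dataset tooling.
results = {
    "harness|drop|3": {"em": 0.004404362416107382, "f1": 0.07597000838926182},
    "harness|gsm8k|5": {"acc": 0.12357846853677028},
    "harness|winogrande|5": {"acc": 0.7782162588792423},
}

def metric_by_task(results: dict, metric: str) -> dict:
    """Map task name -> metric value for tasks that report that metric."""
    return {
        key.split("|")[1]: vals[metric]
        for key, vals in results.items()
        if metric in vals
    }
```

For example, `metric_by_task(results, "acc")` yields accuracies keyed by task name (`gsm8k`, `winogrande`), which mirrors how the aggregated "results" configuration is displayed on the leaderboard.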
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Pav17/T3-gen-dataset-3 | ---
dataset_info:
features:
- name: task_id
dtype: int32
- name: text
dtype: string
- name: code
dtype: string
- name: test_list
sequence: string
- name: test_setup_code
dtype: string
- name: challenge_test_list
sequence: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 725719
num_examples: 374
- name: test
num_bytes: 984921
num_examples: 500
- name: validation
num_bytes: 174450
num_examples: 90
- name: prompt
num_bytes: 19060
num_examples: 10
download_size: 555357
dataset_size: 1904150
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: prompt
path: data/prompt-*
---
|
ChristophSchuhmann/essays-with-instructions | ---
license: apache-2.0
---
|
Seanxh/twitter_dataset_1713216822 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 194829
num_examples: 456
download_size: 67581
dataset_size: 194829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KADUZADA/ED_MOTTA | ---
license: openrail
---
|
pavelmarcolian/echelon-fine-tuning-dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 136874
num_examples: 613
download_size: 67640
dataset_size: 136874
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jiuyuan/train_cypher | ---
license: apache-2.0
---
|
HydraLM/filtered-1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float32
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 13560566066
num_examples: 2297193
download_size: 13048058105
dataset_size: 13560566066
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "filtered-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |