| datasetId | card |
|---|---|
pvduy/code_prompt_evol_lemur | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: outputs
dtype: string
splits:
- name: train
num_bytes: 122799704
num_examples: 60000
download_size: 53813219
dataset_size: 122799704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_prompt_evol_lemur"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ltjabc/sanguosha | ---
license: other
---
|
tau/sled | ---
language:
- en
license:
- mit
task_categories:
- question-answering
- summarization
- text-generation
task_ids:
- multiple-choice-qa
- natural-language-inference
configs:
- gov_report
- summ_screen_fd
- qmsum
- qasper
- narrative_qa
- quality
- contract_nli
- squad
- squad_shuffled_distractors
- squad_ordered_distractors
- hotpotqa
- hotpotqa_second_only
tags:
- multi-hop-question-answering
- query-based-summarization
- long-texts
---
## Dataset Description
- **Repository:** [SLED Github repository](https://github.com/Mivg/SLED)
- **Paper:** [Efficient Long-Text Understanding with Short-Text Models](https://arxiv.org/pdf/2208.00748.pdf)
# Dataset Card for SLED
## Overview
This dataset is based on the [SCROLLS](https://huggingface.co/datasets/tau/scrolls) dataset ([paper](https://arxiv.org/pdf/2201.03533.pdf)), the [SQuAD 1.1](https://huggingface.co/datasets/squad) dataset and the [HotpotQA](https://huggingface.co/datasets/hotpot_qa) dataset.
It doesn't contain any unpublished data, but includes the configuration needed for the [Efficient Long-Text Understanding with Short-Text Models](https://arxiv.org/pdf/2208.00748.pdf) paper.
## Tasks
The tasks included are:
#### GovReport ([Huang et al., 2021](https://arxiv.org/pdf/2104.02112.pdf))
GovReport is a summarization dataset of reports addressing various national policy issues published by the
Congressional Research Service and the U.S. Government Accountability Office, where each document is paired with a hand-written executive summary.
The reports and their summaries are longer than their equivalents in other popular long-document summarization datasets;
for example, GovReport's documents are approximately 1.5 and 2.5 times longer than the documents in Arxiv and PubMed, respectively.
#### SummScreenFD ([Chen et al., 2021](https://arxiv.org/pdf/2104.07091.pdf))
SummScreenFD is a summarization dataset in the domain of TV shows (e.g. Friends, Game of Thrones).
Given a transcript of a specific episode, the goal is to produce the episode's recap.
The original dataset is divided into two complementary subsets, based on the source of its community contributed transcripts.
For SCROLLS, we use the ForeverDreaming (FD) subset, as it incorporates 88 different shows,
making it a more diverse alternative to the TV MegaSite (TMS) subset, which has only 10 shows.
Community-authored recaps for the ForeverDreaming transcripts were collected from English Wikipedia and TVMaze.
#### QMSum ([Zhong et al., 2021](https://arxiv.org/pdf/2104.05938.pdf))
QMSum is a query-based summarization dataset, consisting of 232 meeting transcripts from multiple domains.
The corpus covers academic group meetings at the International Computer Science Institute and their summaries, industrial product meetings for designing a remote control,
and committee meetings of the Welsh and Canadian Parliaments, dealing with a variety of public policy issues.
Annotators were tasked with writing queries about the broad contents of the meetings, as well as specific questions about certain topics or decisions,
while ensuring that the relevant text for answering each query spans at least 200 words or 10 turns.
#### NarrativeQA ([Kočiský et al., 2018](https://arxiv.org/pdf/1712.07040.pdf))
NarrativeQA (Kočiský et al., 2018) is an established question answering dataset over entire books from Project Gutenberg and movie scripts from different websites.
Annotators were given summaries of the books and scripts obtained from Wikipedia, and asked to generate question-answer pairs,
resulting in about 30 questions and answers for each of the 1,567 books and scripts.
They were encouraged to use their own words rather than copying, and to avoid asking yes/no questions or ones about the cast.
Each question was then answered by an additional annotator, providing each question with two reference answers (unless both answers are identical).
#### Qasper ([Dasigi et al., 2021](https://arxiv.org/pdf/2105.03011.pdf))
Qasper is a question answering dataset over NLP papers filtered from the Semantic Scholar Open Research Corpus (S2ORC).
Questions were written by NLP practitioners after reading only the title and abstract of the papers,
while another set of NLP practitioners annotated the answers given the entire document.
Qasper contains abstractive, extractive, and yes/no questions, as well as unanswerable ones.
#### QuALITY ([Pang et al., 2021](https://arxiv.org/pdf/2112.08608.pdf))
QuALITY is a multiple-choice question answering dataset over articles and stories sourced from Project Gutenberg,
the Open American National Corpus, and more.
Experienced writers wrote questions and distractors, and were incentivized to write answerable, unambiguous questions such that in order to correctly answer them,
human annotators must read large portions of the given document.
Reference answers were then determined by a majority vote among the annotators' and writer's answers.
To measure the difficulty of their questions, Pang et al. conducted a speed validation process,
where another set of annotators were asked to answer questions given only a short period of time to skim through the document.
As a result, 50% of the questions in QuALITY are labeled as hard, i.e. the majority of the annotators in the speed validation setting chose the wrong answer.
#### ContractNLI ([Koreeda and Manning, 2021](https://arxiv.org/pdf/2110.01799.pdf))
Contract NLI is a natural language inference dataset in the legal domain.
Given a non-disclosure agreement (the premise), the task is to predict whether a particular legal statement (the hypothesis) is entailed by, contradicted by, or not mentioned in (neutral) the contract.
The NDAs were manually picked after simple filtering from the Electronic Data Gathering, Analysis, and Retrieval system (EDGAR) and Google.
The dataset contains a total of 607 contracts and 17 unique hypotheses, which were combined to produce the dataset's 10,319 examples.
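The example count follows directly from pairing every contract with every hypothesis; a quick sanity check of the arithmetic:

```python
contracts = 607
hypotheses = 17
# 10,319 examples = every contract paired with every unique hypothesis
assert contracts * hypotheses == 10_319
```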
#### SQuAD 1.1 ([Rajpurkar et al., 2016](https://arxiv.org/pdf/1606.05250.pdf))
Stanford Question Answering Dataset (SQuAD) is a reading comprehension
dataset, consisting of questions posed by crowdworkers on a set of Wikipedia
articles, where the answer to every question is a segment of text, or span,
from the corresponding reading passage, or the question might be unanswerable.
#### HotpotQA ([Yang et al., 2018](https://arxiv.org/pdf/1809.09600.pdf))
HotpotQA is a new dataset with 113k Wikipedia-based question-answer pairs with four key features:
(1) the questions require finding and reasoning over multiple supporting documents to answer;
(2) the questions are diverse and not constrained to any pre-existing knowledge bases or knowledge schemas;
(3) we provide sentence-level supporting facts required for reasoning, allowing QA systems to reason with strong supervision and explain the predictions;
(4) we offer a new type of factoid comparison questions to test QA systems' ability to extract relevant facts and perform necessary comparison.
## Data Fields
All the datasets in the benchmark are in the same input-output format:
- `input`: a `string` feature. The input document.
- `input_prefix`: an optional `string` feature, for datasets that contain a prefix (e.g. a question)
- `output`: a `string` feature. The target.
- `id`: a `string` feature. Unique per input.
- `pid`: a `string` feature. Unique per input-output pair (can differ from `id` in NarrativeQA and Qasper, where there is more than one valid target).
The datasets that contain `input_prefix` are:
- SQuAD - the question
- HotpotQA - the question
- qmsum - the query
- qasper - the question
- narrative_qa - the question
- quality - the question + the four choices
- contract_nli - the hypothesis
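For datasets with an `input_prefix`, a natural way to build the model input is to prepend the prefix to the document. A minimal sketch, assuming a simple string concatenation (the separator is an assumption, not part of the official loader):

```python
def build_model_input(example, sep="\n\n"):
    """Prepend the optional prefix (question/query/hypothesis) to the input document."""
    prefix = example.get("input_prefix")
    if prefix:
        return f"{prefix}{sep}{example['input']}"
    return example["input"]

# Hypothetical Qasper-style record for illustration
example = {
    "id": "q1",
    "pid": "q1_0",
    "input_prefix": "What is the main contribution of the paper?",
    "input": "We propose SLED, a method for efficient long-text understanding...",
    "output": "SLED",
}
model_input = build_model_input(example)
```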
## Controlled experiments
To test multiple properties of SLED, we modify SQuAD 1.1 ([Rajpurkar et al., 2016](https://arxiv.org/pdf/1606.05250.pdf))
and HotpotQA ([Yang et al., 2018](https://arxiv.org/pdf/1809.09600.pdf)) to create a few controlled experiment settings.
Those are accessible via the following configurations:
- squad - Contains the original version of SQuAD 1.1 (question + passage)
- squad_ordered_distractors - For each example, 9 random distractor passages are concatenated (separated by '\n')
- squad_shuffled_distractors - For each example, 9 random distractor passages are added (separated by '\n'), and the 10 passages are then randomly shuffled
- hotpotqa - A clean version of HotpotQA, where each input contains only the two gold passages (separated by '\n')
- hotpotqa_second_only - In each example, the input contains only the second gold passage
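The distractor constructions above can be sketched as follows. This is an illustrative approximation, not the exact script used to build the configurations; the passage variables are hypothetical:

```python
import random

def with_distractors(gold_passage, distractors, shuffle=False, seed=0):
    """Concatenate a gold passage with 9 distractor passages, '\n'-separated.

    shuffle=False mimics squad_ordered_distractors (gold passage stays first);
    shuffle=True mimics squad_shuffled_distractors (gold position is random).
    """
    rng = random.Random(seed)
    passages = [gold_passage] + list(distractors[:9])
    if shuffle:
        rng.shuffle(passages)
    return "\n".join(passages)

gold = "Passage containing the answer."
others = [f"Distractor passage {i}." for i in range(9)]
ordered = with_distractors(gold, others)                 # gold first, 9 distractors after
shuffled = with_distractors(gold, others, shuffle=True)  # same 10 passages, random order
```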
## Citation
If you use this dataset, **please make sure to cite all the original dataset papers as well as SCROLLS.** [[bibtex](https://drive.google.com/uc?export=download&id=1IUYIzQD9DPsECw0JWkwk4Ildn8JOMtuU)]
```
@inproceedings{Ivgi2022EfficientLU,
title={Efficient Long-Text Understanding with Short-Text Models},
author={Maor Ivgi and Uri Shaham and Jonathan Berant},
year={2022}
}
``` |
tyzhu/random25eof_find_passage_train1000_eval100_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 219292
num_examples: 2100
- name: validation
num_bytes: 11904
num_examples: 100
download_size: 0
dataset_size: 231196
---
# Dataset Card for "random25eof_find_passage_train1000_eval100_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
USGOV/gov | ---
license: other
license_name: government-open-source
license_link: LICENSE
---
|
yardeny/t5_small_test_set_context_len_64 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 419840
num_examples: 1280
download_size: 171561
dataset_size: 419840
---
# Dataset Card for "t5_small_test_set_context_len_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_v2_ru_train | ---
pretty_name: '`mmarco/v2/ru/train`'
viewer: false
source_datasets: ['irds/mmarco_v2_ru']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/ru/train`
The `mmarco/v2/ru/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/ru/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels` (i.e., relevance assessments); count=532,761
- `docpairs`; count=39,780,811
- For `docs`, use [`irds/mmarco_v2_ru`](https://huggingface.co/datasets/irds/mmarco_v2_ru)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_v2_ru_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_v2_ru_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
docpairs = load_dataset('irds/mmarco_v2_ru_train', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
kyujinpy/Mistral-EN-DPO-9K | ---
language:
- en
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 30504054
num_examples: 9181
download_size: 16935806
dataset_size: 30504054
---
# Dataset Card for "Mistral-EN-DPO-9K"
# Info
We used the [snorkelai/Snorkel-Mistral-PairRM-DPO-Dataset](https://huggingface.co/datasets/snorkelai/Snorkel-Mistral-PairRM-DPO-Dataset?row=10) dataset.
We selected the [train_iteration_1](https://huggingface.co/datasets/snorkelai/Snorkel-Mistral-PairRM-DPO-Dataset/viewer) part.
# Pre-processing
1. Remove coding task
```
Filtering words: ['[Latex]', 'java', 'SQL', 'C#', 'nextjs', 'react', 'Ruby', 'Lua', 'Unity', 'XML', 'qrcode', 'jest', 'const',
'python', 'Python', 'R code', 'Next.js', 'Node.js', 'Typescript', 'HTML', 'php', 'skeleton code',
'MATLAB', 'using js', ...]
```
2. Remove translation task
```
Filtering words: ['Japanese', 'Chinese', 'translat', 'Bengali', 'Marathi', 'Translat', 'Portuguese',
'Hindi', 'Persian', 'Lao', 'French', ...]
```
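The two filtering passes above amount to a substring check over each prompt. A minimal sketch of that filter; the exact matching logic and the full keyword lists are assumptions, since only partial lists are shown:

```python
# Abbreviated keyword lists from the card; the full lists are elided ("...") above.
CODE_WORDS = ["[Latex]", "java", "SQL", "C#", "python", "Python", "HTML", "php"]
TRANSLATION_WORDS = ["Japanese", "Chinese", "translat", "Translat", "Hindi", "French"]

def keep_example(prompt, banned=CODE_WORDS + TRANSLATION_WORDS):
    """Drop any example whose prompt mentions a coding or translation keyword."""
    return not any(word in prompt for word in banned)

examples = [
    {"prompt": "Write a python script to sort a list"},       # dropped: 'python'
    {"prompt": "Translate this sentence into French"},         # dropped: 'Translat'
    {"prompt": "Summarize the plot of the novel"},             # kept
]
filtered = [ex for ex in examples if keep_example(ex["prompt"])]
```

Note that plain substring matching is coarse: for example, 'French' would also drop non-translation prompts that merely mention France.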
# Number of examples: about 9K |
Ndumiso99/CV_Data | ---
license: apache-2.0
---
|
chillguypoonawala/dsdsd | ---
tags:
- sike
saga: costella
license: creativeml-openrail-m
task_categories:
- translation
language:
- ak
- ae
pretty_name: dsdsd
size_categories:
- 10M<n<100M
--- |
autoevaluate/autoeval-eval-futin__feed-sen_vi-b48d12-2175169953 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b7
metrics: []
dataset_name: futin/feed
dataset_config: sen_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b7
* Dataset: futin/feed
* Config: sen_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
anaili/title_BREAST_embdeddings | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2-Test-14 | ---
pretty_name: Evaluation run of Sao10K/Fimbulvetr-11B-v2-Test-14
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Fimbulvetr-11B-v2-Test-14](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2-Test-14)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2-Test-14\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T16:00:05.940666](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2-Test-14/blob/main/results_2024-02-09T16-00-05.940666.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6705816022863044,\n\
\ \"acc_stderr\": 0.03151142284634416,\n \"acc_norm\": 0.671931569096393,\n\
\ \"acc_norm_stderr\": 0.032148330655539875,\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6342749025395696,\n\
\ \"mc2_stderr\": 0.0156107236020673\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441372,\n\
\ \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.013385021637313576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.696673969328819,\n\
\ \"acc_stderr\": 0.00458755357710126,\n \"acc_norm\": 0.877912766381199,\n\
\ \"acc_norm_stderr\": 0.00326717445844976\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154964,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154964\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.02141724293632158,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.02141724293632158\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033446,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033446\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846322,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846322\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884864,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884864\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026622,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026622\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4983240223463687,\n\
\ \"acc_stderr\": 0.016722407608296398,\n \"acc_norm\": 0.4983240223463687,\n\
\ \"acc_norm_stderr\": 0.016722407608296398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5071707953063885,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.5071707953063885,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789513,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174927,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174927\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6342749025395696,\n\
\ \"mc2_stderr\": 0.0156107236020673\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6482183472327521,\n \
\ \"acc_stderr\": 0.013153446023536044\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Fimbulvetr-11B-v2-Test-14
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|arc:challenge|25_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|gsm8k|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hellaswag|10_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-00-05.940666.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T16-00-05.940666.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- '**/details_harness|winogrande|5_2024-02-09T16-00-05.940666.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T16-00-05.940666.parquet'
- config_name: results
data_files:
- split: 2024_02_09T16_00_05.940666
path:
- results_2024-02-09T16-00-05.940666.parquet
- split: latest
path:
- results_2024-02-09T16-00-05.940666.parquet
---
# Dataset Card for Evaluation run of Sao10K/Fimbulvetr-11B-v2-Test-14
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Fimbulvetr-11B-v2-Test-14](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2-Test-14) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2-Test-14",
	"harness_winogrande_5",
	split="latest")
```
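Once loaded, the per-task accuracies can be aggregated locally. The sketch below uses a small hand-copied subset of the scores reported under "Latest results" in this card (the dict is illustrative; in practice you would read the values from the "results" configuration):

```python
# Minimal sketch: averaging per-task accuracies.
# The dict below is a hand-copied subset of the scores shown in this card.
scores = {
    "harness|arc:challenge|25": 0.6655290102389079,
    "harness|hellaswag|10": 0.696673969328819,
    "harness|hendrycksTest-abstract_algebra|5": 0.36,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(scores.values()) / len(scores)
print(f"mean acc over {len(scores)} tasks: {mean_acc:.4f}")
```

Note that the leaderboard's reported "all" accuracy averages over every evaluated task, so a subset mean like this will generally differ from it.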
## Latest results
These are the [latest results from run 2024-02-09T16:00:05.940666](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2-Test-14/blob/main/results_2024-02-09T16-00-05.940666.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6705816022863044,
"acc_stderr": 0.03151142284634416,
"acc_norm": 0.671931569096393,
"acc_norm_stderr": 0.032148330655539875,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6342749025395696,
"mc2_stderr": 0.0156107236020673
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441372,
"acc_norm": 0.7005119453924915,
"acc_norm_stderr": 0.013385021637313576
},
"harness|hellaswag|10": {
"acc": 0.696673969328819,
"acc_stderr": 0.00458755357710126,
"acc_norm": 0.877912766381199,
"acc_norm_stderr": 0.00326717445844976
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154964,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154964
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632158,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632158
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033446,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033446
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846322,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884864,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884864
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026622,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026622
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4983240223463687,
"acc_stderr": 0.016722407608296398,
"acc_norm": 0.4983240223463687,
"acc_norm_stderr": 0.016722407608296398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7716049382716049,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.7716049382716049,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5071707953063885,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.5071707953063885,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789513,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174927,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174927
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6342749025395696,
"mc2_stderr": 0.0156107236020673
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825912
},
"harness|gsm8k|5": {
"acc": 0.6482183472327521,
"acc_stderr": 0.013153446023536044
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ayeshgk/java_bug_fix_ctx_err_small_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: bug_ctx
dtype: string
- name: fixed
dtype: string
splits:
- name: train
num_bytes: 57631
num_examples: 120
- name: validation
num_bytes: 19635
num_examples: 36
- name: test
num_bytes: 687
num_examples: 4
download_size: 21881
dataset_size: 77953
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
anforsm/movie_posters-genres-80k-transformed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: int64
- name: image
sequence:
sequence:
sequence: float32
- name: genres
sequence: int64
splits:
- name: train
num_bytes: 23128566147.416473
num_examples: 78352
- name: test
num_bytes: 295187948.58352655
num_examples: 1000
download_size: 22030369211
dataset_size: 23423754096.0
---
# Dataset Card for "movie_posters-genres-80k-transformed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AryanLala/autonlp-data-Scientific_Title_Generator | ---
task_categories:
- conditional-text-generation
---
# AutoNLP Dataset for project: Scientific_Title_Generator
## Table of contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project Scientific_Title_Generator.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"target": "Unification of Fusion Theories, Rules, Filters, Image Fusion and Target\n Tracking Methods (UFT)",
"text": " The author has pledged in various papers, conference or seminar\npresentations, and scientific grant applications (between 2004-2015) for the\nunification of fusion theories, combinations of fusion rules, image fusion\nprocedures, filter algorithms, and target tracking methods for more accurate\napplications to our real world problems - since neither fusion theory nor\nfusion rule fully satisfy all needed applications. For each particular\napplication, one selects the most appropriate fusion space and fusion model,\nthen the fusion rules, and the algorithms of implementation. He has worked in\nthe Unification of the Fusion Theories (UFT), which looks like a cooking\nrecipe, better one could say like a logical chart for a computer programmer,\nbut one does not see another method to comprise/unify all things. The\nunification scenario presented herein, which is now in an incipient form,\nshould periodically be updated incorporating new discoveries from the fusion\nand engineering research.\n"
},
{
"target": "Investigation of Variances in Belief Networks",
"text": " The belief network is a well-known graphical structure for representing\nindependences in a joint probability distribution. The methods, which perform\nprobabilistic inference in belief networks, often treat the conditional\nprobabilities which are stored in the network as certain values. However, if\none takes either a subjectivistic or a limiting frequency approach to\nprobability, one can never be certain of probability values. An algorithm\nshould not only be capable of reporting the probabilities of the alternatives\nof remaining nodes when other nodes are instantiated; it should also be capable\nof reporting the uncertainty in these probabilities relative to the uncertainty\nin the probabilities which are stored in the network. In this paper a method\nfor determining the variances in inferred probabilities is obtained under the\nassumption that a posterior distribution on the uncertainty variables can be\napproximated by the prior distribution. It is shown that this assumption is\nplausible if their is a reasonable amount of confidence in the probabilities\nwhich are stored in the network. Furthermore in this paper, a surprising upper\nbound for the prior variances in the probabilities of the alternatives of all\nnodes is obtained in the case where the probability distributions of the\nprobabilities of the alternatives are beta distributions. It is shown that the\nprior variance in the probability at an alternative of a node is bounded above\nby the largest variance in an element of the conditional probability\ndistribution for that node.\n"
}
]
```
### Data Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "Value(dtype='string', id=None)",
"text": "Value(dtype='string', id=None)"
}
```
### Data Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 5784 |
| valid | 1446 |
|
JiggaBooJombs/Novelist | ---
license: apache-2.0
---
|
ivelin/rico_sca_refexp_synthetic_saved | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: string
- name: labels
list:
- name: prompt
dtype: string
- name: target_bounding_box
struct:
- name: xmin
dtype: float32
- name: ymin
dtype: float32
- name: xmax
dtype: float32
- name: ymax
dtype: float32
splits:
- name: train
num_bytes: 2604982403.694
num_examples: 24063
- name: validation
num_bytes: 21192787.0
num_examples: 160
- name: test
num_bytes: 22057836.0
num_examples: 185
download_size: 2096931333
dataset_size: 2648233026.694
---
# Dataset Card for "rico_sca_refexp_synthetic_saved"
This is a saved snapshot of the dynamically generated [Rico SCA RefExp dataset](https://huggingface.co/datasets/ivelin/rico_sca_refexp_synthetic) |
sethapun/imdb_misspelled_50 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 33633024
num_examples: 25000
- name: validation
num_bytes: 32850470
num_examples: 25000
download_size: 45974325
dataset_size: 66483494
---
# Dataset Card for "imdb_misspelled_50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo2_100_kl_0.1_prm_70m_thr_1.0_seed_3 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43586042
num_examples: 18929
- name: epoch_1
num_bytes: 44151991
num_examples: 18929
- name: epoch_2
num_bytes: 44251667
num_examples: 18929
- name: epoch_3
num_bytes: 44314260
num_examples: 18929
- name: epoch_4
num_bytes: 44366122
num_examples: 18929
- name: epoch_5
num_bytes: 44400390
num_examples: 18929
- name: epoch_6
num_bytes: 44415561
num_examples: 18929
- name: epoch_7
num_bytes: 44437049
num_examples: 18929
- name: epoch_8
num_bytes: 44446436
num_examples: 18929
- name: epoch_9
num_bytes: 44455579
num_examples: 18929
- name: epoch_10
num_bytes: 44459676
num_examples: 18929
- name: epoch_11
num_bytes: 44462711
num_examples: 18929
- name: epoch_12
num_bytes: 44463325
num_examples: 18929
- name: epoch_13
num_bytes: 44465643
num_examples: 18929
- name: epoch_14
num_bytes: 44464360
num_examples: 18929
- name: epoch_15
num_bytes: 44467564
num_examples: 18929
- name: epoch_16
num_bytes: 44471601
num_examples: 18929
- name: epoch_17
num_bytes: 44469631
num_examples: 18929
- name: epoch_18
num_bytes: 44469922
num_examples: 18929
- name: epoch_19
num_bytes: 44471042
num_examples: 18929
- name: epoch_20
num_bytes: 44471127
num_examples: 18929
- name: epoch_21
num_bytes: 44469454
num_examples: 18929
- name: epoch_22
num_bytes: 44472951
num_examples: 18929
- name: epoch_23
num_bytes: 44472405
num_examples: 18929
- name: epoch_24
num_bytes: 44474227
num_examples: 18929
- name: epoch_25
num_bytes: 44476591
num_examples: 18929
- name: epoch_26
num_bytes: 44472672
num_examples: 18929
- name: epoch_27
num_bytes: 44476029
num_examples: 18929
- name: epoch_28
num_bytes: 44476435
num_examples: 18929
- name: epoch_29
num_bytes: 44477999
num_examples: 18929
download_size: 699959989
dataset_size: 1332230462
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
jgp756/m | ---
license: openrail
---
|
changpt/ko-lima-vicuna | ---
license: cc-by-2.0
task_categories:
- text-generation
language:
- ko
size_categories:
- n<1K
pretty_name: KoLima(vicuna)
tags:
- KoLima
---
# Ko Lima Vicuna Dataset
This dataset was regenerated in Korean from the [lima_vicuna_format data](https://huggingface.co/datasets/64bits/lima_vicuna_format) using the GPT4 API.
When using GPT4, the prompt was written to read "Regenerate the data by referring to the original content, not just a simple translation."
Out of the 1,030 examples in the original dataset, 866 were regenerated.
*The dataset was created by [HASHSCRAPER](https://www.hashscraper.com/), a company specializing in web scraping.*
---
# How to use
```python
>>> from datasets import load_dataset
>>> data = load_dataset("changpt/ko-lima-vicuna", data_files="ko_lima_vicuna.json", split="train")
>>> data
Dataset({
features: ['id', 'conversations'],
num_rows: 866
})
```
```python
>>> data[0]
{'id': '7-stackexchange',
'conversations': [{'from': 'human',
'value': '나는 ```ffmpeg```로 비디오를 GIF 파일로 변환하고 있어:\n```ffmpeg \\\n -i input.flv \\\n -ss 00:00:00.000 \\\n -pix_fmt rgb24 \\\n -r 10 \\\n -s 320x240 \\\n -t 00:00:10.000 \\\n output.gif\n```\n잘 작동하는데, 출력된 gif 파일의 품질이 너무 낮아. 변환된 gif의 품질을 어떻게 개선할 수 있을까?'},
{'from': 'gpt',
'value': '중간 이미지 파일을 피하고 싶다면, LordNeckBeard가 제공한 명령어를 ```ffmpeg```와 ImageMagick의 ```convert``` 사이에 파이프로 연결할 수 있습니다. 이렇게 하면 중간 파일이 필요하지 않습니다:\n```ffmpeg -i input.flv -vf scale=320:-1 -r 10 -f image2pipe -vcodec ppm - | convert -delay 10 -loop 0 - output.gif\n```\n```-f image2pipe```는 ffmpeg에게 비디오를 이미지로 분할하고 파이프 출력에 적합하게 만들라고 지시하며, ```-vcodec ppm```은 출력 형식을 ppm으로 지정합니다(png 형식인 경우, ```convert```가 파이프에서 모든 이미지를 읽지 않거나 ffmpeg가 모두 출력하지 않는 것 같습니다). 두 명령어의 ```-```는 각각 출력과 입력에 파이프를 사용하도록 지정합니다.\n파일을 저장하지 않고 결과를 최적화하려면, ```convert```의 출력을 두 번째 ```convert``` 명령어로 파이프할 수 있습니다:\n```ffmpeg -i input.flv -vf scale=320:-1 -r 10 -f image2pipe -vcodec ppm - | convert -delay 10 -loop 0 - gif:- | convert -layers Optimize - output.gif\n```\n```gif:-```는 ```convert```에게 출력을 gif 형식의 데이터로 파이프하도록 지시하고, ```-layers Optimize```는 두 번째 ```convert```에게 ```optimize-frame``` 및 ```optimize-transparancy``` 방법을 수행하도록 지시합니다(ImageMagick 애니메이션 최적화 소개 참조). ```-layers Optimize```의 출력이 항상 더 작은 파일 크기를 제공하지는 않으므로, 먼저 최적화 없이 gif로 변환해 보는 것이 좋습니다.\n이 과정에서 모든 것이 메모리에 있으므로 이미지가 매우 큰 경우 충분한 메모리가 필요할 수 있습니다.'}]}
```
---
# License
[CC BY 2.0 KR](https://creativecommons.org/licenses/by/2.0/kr/)
[Open AI](https://openai.com/policies/terms-of-use) |
mtc/multirc | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 1465576
num_examples: 4446
download_size: 194622
dataset_size: 1465576
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Chat-Error/short-stories | ---
tags:
- not-for-all-audiences
task_categories:
- conversational
language:
- en
---
Credit to Suikamelon |
mbazaNLP/kinyarwanda-tts-dataset | ---
language:
- rw
language_creators:
- "Digital Umuganda"
license:
- cc-by-4.0
size_categories:
- 3K<n<4K
- ~6hours
---
# Kinyarwanda TTS dataset
The dataset consists of 3,992 clips of a Kinyarwanda TTS corpus recorded in a studio by a voice actress; it was collected as part of the Mbaza project.
## Data structure
```
Audio: 3,992 single-voice studio recordings by a voice actress
Text: CSV with audio name and corresponding written text
```
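A minimal sketch of pairing the metadata CSV with its clips. The column names `audio` and `text` are hypothetical — check the header shipped with the corpus and adjust:

```python
import csv
import io

# Hypothetical metadata CSV: the "audio" and "text" column names are
# assumptions, not the corpus's actual header.
sample_csv = """audio,text
clip_0001.wav,Muraho neza
clip_0002.wav,Murakoze cyane
"""

# Map each audio file name to its transcription.
with io.StringIO(sample_csv) as f:
    transcripts = {row["audio"]: row["text"] for row in csv.DictReader(f)}

print(transcripts["clip_0001.wav"])  # -> Muraho neza
```

In practice you would open the real CSV file instead of the `StringIO` sample and resolve each `audio` name against the clip directory.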
## Language
The dataset is in the Kinyarwanda language.
## Dataset Creation
- The collected text had to cover Kinyarwanda syllables, each formed by a consonant or group of consonants (e.g. Nyw) combined with a vowel.
- The text was reviewed by a linguist to ensure it fits Kinyarwanda standards.
- The voice was recorded in a studio, albeit in a semi-professional setting (i.e. some of the audio contains reverb).
|
guanaco/guanaco_clean | ---
license: cc-by-nc-sa-4.0
---
|
Dahoas/split_no_nl_cot_gsm8k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: next_sent
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 17568449
num_examples: 22929
- name: test
num_bytes: 3317058
num_examples: 4282
- name: val
num_bytes: 607296
num_examples: 793
download_size: 5199426
dataset_size: 21492803
---
# Dataset Card for "split_no_nl_cot_gsm8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
000hen/captchaCode | ---
license: apache-2.0
---
|
byroneverson/shell-cmd-instruct | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- instruction-finetuning
pretty_name: Shell Command Instruct
---
# **Used to train models that interact directly with shells**
Follow-up details of my process
- MacOS terminal commands for now. This dataset is still in alpha stages and will be modified.
- Contains 500 somewhat unique training examples so far.
- GPT4 seems like a good candidate for generating more data; licensing would need to be addressed.
- I fine-tuned Solar-10.7B-Instruct-v1.0 with this dataset using a slightly modified version of axolotl. Just a few epochs was enough to get it to output correctly.
- I use oobabooga/text-generation-webui with a custom chat extension for inference. No sandbox is used, it is piped directly into MacOS bash because I'm reckless. C:
- Currently working towards training an MoE (2x7B), multi-modal model (image/text) with this dataset. (BakLLaVA-1-7B + LLaVA-v1.5-7B)
- Inference stages:
  1. Send the instruction to the model; expect a command.
  2. Detect the shell command and send it to a sandboxed shell.
  3. Send the shell response as additional input to the model.
  4. Send the final model response to the user from the assistant.
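The stages above can be sketched as a minimal loop. This is only an illustrative sketch: `ask_model` is a placeholder for whatever inference call you use, the prompt layout mirrors the sample prompt in this card, and a real deployment should sandbox the shell rather than pipe into it directly.

```python
import subprocess

def run_shell(command: str, timeout: int = 10) -> str:
    """Run a command in a shell and capture its output.

    NOTE: for illustration only — a real deployment should use a sandbox
    instead of piping straight into the local shell.
    """
    result = subprocess.run(
        command, shell=True, capture_output=True, text=True, timeout=timeout
    )
    return (result.stdout + result.stderr).strip()

def infer(instruction: str, ask_model) -> str:
    """One round of the inference stages described above."""
    prompt = f"### User:\n{instruction}\n### Command:\n"
    command = ask_model(prompt)            # stage 1: model emits a command
    shell_output = run_shell(command)      # stage 2: execute it in the shell
    prompt += f"{command}\n### Shell:\n{shell_output}\n### Assistant:\n"
    return ask_model(prompt)               # stages 3-4: final assistant reply

# Stubbed model for demonstration; swap in a real inference call.
def fake_model(prompt: str) -> str:
    if prompt.endswith("### Command:\n"):
        return "echo hello"
    return "The shell printed: hello"

print(infer("Print hello", fake_model))  # -> The shell printed: hello
```

Replace `fake_model` with your actual model call (e.g. a text-generation-webui API request) to reproduce the pipeline end to end.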
TODO:
- Possible "os" column to specify which system the command should be used with, maybe separate datasets for each system type.
## **Sample prompt: (in series, depends on your specific model prompt)**
```
### User:
List files in 'Downloads'
### Command:
ls ~/Downloads
```
```
### Shell:
file1.pdf file2.txt file3.zip
### Assistant:
Listing files in 'Downloads': file1.pdf file2.txt file3.zip
```
|
open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85 | ---
pretty_name: Evaluation run of uukuguy/neural-chat-7b-v3-1-dare-0.85
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/neural-chat-7b-v3-1-dare-0.85](https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-07T21:41:35.710987](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85/blob/main/results_2023-12-07T21-41-35.710987.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6418688892912601,\n\
\ \"acc_stderr\": 0.032096127796783475,\n \"acc_norm\": 0.6471999464850214,\n\
\ \"acc_norm_stderr\": 0.03273684834996561,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4490316447301984,\n\
\ \"mc2_stderr\": 0.014392988561545562\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.01443413871337998,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349812\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6362278430591516,\n\
\ \"acc_stderr\": 0.0048010096576904405,\n \"acc_norm\": 0.8383788090021908,\n\
\ \"acc_norm_stderr\": 0.0036735065123709503\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n\
\ \"acc_stderr\": 0.016265675632010354,\n \"acc_norm\": 0.8256880733944955,\n\
\ \"acc_norm_stderr\": 0.016265675632010354\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n\
\ \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.015201032512520429,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.015201032512520429\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4490316447301984,\n\
\ \"mc2_stderr\": 0.014392988561545562\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987727\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42153146322971946,\n \
\ \"acc_stderr\": 0.013601824409483272\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|arc:challenge|25_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|gsm8k|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hellaswag|10_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-07T21-41-35.710987.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- '**/details_harness|winogrande|5_2023-12-07T21-41-35.710987.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-07T21-41-35.710987.parquet'
- config_name: results
data_files:
- split: 2023_12_07T21_41_35.710987
path:
- results_2023-12-07T21-41-35.710987.parquet
- split: latest
path:
- results_2023-12-07T21-41-35.710987.parquet
---
# Dataset Card for Evaluation run of uukuguy/neural-chat-7b-v3-1-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/neural-chat-7b-v3-1-dare-0.85](https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85",
"harness_winogrande_5",
split="train")
```
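The timestamped split names visible in the YAML configuration above appear to follow directly from the run timestamp used in the parquet file names, with `-` replaced by `_`. A small sketch of that apparent convention (an observation from this card, not a documented API):

```python
# Timestamp as it appears in the parquet file names above.
run_timestamp = "2023-12-07T21-41-35.710987"

# Split names in the YAML configs use underscores instead of hyphens.
split_name = run_timestamp.replace("-", "_")
print(split_name)  # 2023_12_07T21_41_35.710987
```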
## Latest results
These are the [latest results from run 2023-12-07T21:41:35.710987](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85/blob/main/results_2023-12-07T21-41-35.710987.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6418688892912601,
"acc_stderr": 0.032096127796783475,
"acc_norm": 0.6471999464850214,
"acc_norm_stderr": 0.03273684834996561,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4490316447301984,
"mc2_stderr": 0.014392988561545562
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.01443413871337998,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349812
},
"harness|hellaswag|10": {
"acc": 0.6362278430591516,
"acc_stderr": 0.0048010096576904405,
"acc_norm": 0.8383788090021908,
"acc_norm_stderr": 0.0036735065123709503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010354,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520429,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520429
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4490316447301984,
"mc2_stderr": 0.014392988561545562
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987727
},
"harness|gsm8k|5": {
"acc": 0.42153146322971946,
"acc_stderr": 0.013601824409483272
}
}
```
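As a sketch of how the nested results above can be consumed, the per-task accuracies can be flattened into a ranked list. The dictionary below is an abbreviated, hand-copied excerpt of the JSON above; in practice the full structure comes from the `results` configuration:

```python
# Abbreviated copy of the results structure shown above.
results = {
    "all": {"acc": 0.6418688892912601, "acc_stderr": 0.032096127796783475},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.81},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|winogrande|5": {"acc": 0.7916337805840569},
}

# Rank individual tasks by accuracy, skipping the aggregate "all" entry.
ranked = sorted(
    ((task, scores["acc"]) for task, scores in results.items() if task != "all"),
    key=lambda pair: pair[1],
    reverse=True,
)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```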
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
irds/nfcorpus_dev | ---
pretty_name: '`nfcorpus/dev`'
viewer: false
source_datasets: ['irds/nfcorpus']
task_categories:
- text-retrieval
---
# Dataset Card for `nfcorpus/dev`
The `nfcorpus/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/nfcorpus#nfcorpus/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=325
- `qrels`: (relevance assessments); count=14,589
- For `docs`, use [`irds/nfcorpus`](https://huggingface.co/datasets/irds/nfcorpus)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/nfcorpus_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'title': ..., 'all': ...}
qrels = load_dataset('irds/nfcorpus_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
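Once loaded, the qrels records can be grouped into a per-query lookup for use during evaluation. A minimal sketch, using hand-written records in the shape shown above (the ids here are illustrative; the real records come from `load_dataset('irds/nfcorpus_dev', 'qrels')`):

```python
from collections import defaultdict

# Hand-written records in the shape of the qrels rows shown above;
# in practice these come from load_dataset('irds/nfcorpus_dev', 'qrels').
qrels_records = [
    {"query_id": "PLAIN-1", "doc_id": "MED-10", "relevance": 2, "iteration": "0"},
    {"query_id": "PLAIN-1", "doc_id": "MED-14", "relevance": 1, "iteration": "0"},
    {"query_id": "PLAIN-2", "doc_id": "MED-10", "relevance": 1, "iteration": "0"},
]

# Group relevance judgments by query for quick lookup.
qrels = defaultdict(dict)
for record in qrels_records:
    qrels[record["query_id"]][record["doc_id"]] = record["relevance"]

print(qrels["PLAIN-1"])  # {'MED-10': 2, 'MED-14': 1}
```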
## Citation Information
```
@inproceedings{Boteva2016Nfcorpus,
title="A Full-Text Learning to Rank Dataset for Medical Information Retrieval",
author = "Vera Boteva and Demian Gholipour and Artem Sokolov and Stefan Riezler",
booktitle = "Proceedings of the European Conference on Information Retrieval ({ECIR})",
location = "Padova, Italy",
publisher = "Springer",
year = 2016
}
```
|
open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k | ---
pretty_name: Evaluation run of JunchengXie/Starling-LM-7B-alpha-gpt-4-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JunchengXie/Starling-LM-7B-alpha-gpt-4-80k](https://huggingface.co/JunchengXie/Starling-LM-7B-alpha-gpt-4-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:10:03.480531](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k/blob/main/results_2024-03-29T21-10-03.480531.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6427400931197095,\n\
\ \"acc_stderr\": 0.03222503970792139,\n \"acc_norm\": 0.6448752594792302,\n\
\ \"acc_norm_stderr\": 0.03287014856411763,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5435341327101476,\n\
\ \"mc2_stderr\": 0.015323454299145556\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009126,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6110336586337383,\n\
\ \"acc_stderr\": 0.004865193237024046,\n \"acc_norm\": 0.8127862975502887,\n\
\ \"acc_norm_stderr\": 0.0038928576150164744\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.01612554382355294,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.01612554382355294\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032207,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032207\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5435341327101476,\n\
\ \"mc2_stderr\": 0.015323454299145556\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856546\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6307808946171342,\n \
\ \"acc_stderr\": 0.013293019538066244\n }\n}\n```"
repo_url: https://huggingface.co/JunchengXie/Starling-LM-7B-alpha-gpt-4-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|winogrande|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-10-03.480531.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- results_2024-03-29T21-10-03.480531.parquet
- split: latest
path:
- results_2024-03-29T21-10-03.480531.parquet
---
# Dataset Card for Evaluation run of JunchengXie/Starling-LM-7B-alpha-gpt-4-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JunchengXie/Starling-LM-7B-alpha-gpt-4-80k](https://huggingface.co/JunchengXie/Starling-LM-7B-alpha-gpt-4-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-29T21:10:03.480531](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k/blob/main/results_2024-03-29T21-10-03.480531.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6427400931197095,
"acc_stderr": 0.03222503970792139,
"acc_norm": 0.6448752594792302,
"acc_norm_stderr": 0.03287014856411763,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5435341327101476,
"mc2_stderr": 0.015323454299145556
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009126,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6110336586337383,
"acc_stderr": 0.004865193237024046,
"acc_norm": 0.8127862975502887,
"acc_norm_stderr": 0.0038928576150164744
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.01612554382355294,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.01612554382355294
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032207,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5435341327101476,
"mc2_stderr": 0.015323454299145556
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.011878201073856546
},
"harness|gsm8k|5": {
"acc": 0.6307808946171342,
"acc_stderr": 0.013293019538066244
}
}
```
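The top-level `"all"` block above aggregates the per-task scores; presumably it is the unweighted mean of the individual `"acc"` values. A minimal sketch of that aggregation, using three of the MMLU sub-task accuracies shown above (the task names and values come from the results; the unweighted averaging itself is an assumption about how the leaderboard aggregates):

```python
# Sketch: average per-task accuracies into a single score, the way the
# "all" block presumably summarizes the run (assumption: unweighted mean).
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.39,
    "harness|hendrycksTest-anatomy|5": 0.6444444444444445,
    "harness|hendrycksTest-astronomy|5": 0.6776315789473685,
}

# Unweighted mean over the selected tasks (subset only, for illustration).
mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")
```

Averaging over the full set of tasks in the JSON above (rather than this three-task subset) is what would reproduce the reported `"acc"` under that assumption.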
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Changgil__K2S3-Mistral-7bx2-48layers_v1.2 | ---
pretty_name: Evaluation run of Changgil/K2S3-Mistral-7bx2-48layers_v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Changgil/K2S3-Mistral-7bx2-48layers_v1.2](https://huggingface.co/Changgil/K2S3-Mistral-7bx2-48layers_v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Changgil__K2S3-Mistral-7bx2-48layers_v1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T19:59:03.273138](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-Mistral-7bx2-48layers_v1.2/blob/main/results_2024-03-29T19-59-03.273138.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.579467606046818,\n\
\ \"acc_stderr\": 0.033372240125975824,\n \"acc_norm\": 0.5822000586410329,\n\
\ \"acc_norm_stderr\": 0.034046612878066436,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.4612111840396961,\n\
\ \"mc2_stderr\": 0.015189287391161425\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490978,\n\
\ \"acc_norm\": 0.5631399317406144,\n \"acc_norm_stderr\": 0.014494421584256524\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5812587134037045,\n\
\ \"acc_stderr\": 0.0049234456278615165,\n \"acc_norm\": 0.7783310097590121,\n\
\ \"acc_norm_stderr\": 0.004145206350032319\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278006,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278006\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.0247843169421564,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.015133383278988836,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.015133383278988836\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.02546977014940017,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.02546977014940017\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32513966480446926,\n\
\ \"acc_stderr\": 0.015666542785053562,\n \"acc_norm\": 0.32513966480446926,\n\
\ \"acc_norm_stderr\": 0.015666542785053562\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02736359328468497,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02736359328468497\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.409387222946545,\n\
\ \"acc_stderr\": 0.012558780895570753,\n \"acc_norm\": 0.409387222946545,\n\
\ \"acc_norm_stderr\": 0.012558780895570753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159645,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159645\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02003639376835263,\n \
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02003639376835263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540606,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.4612111840396961,\n\
\ \"mc2_stderr\": 0.015189287391161425\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972385\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4981046247156937,\n \
\ \"acc_stderr\": 0.013772385765569753\n }\n}\n```"
repo_url: https://huggingface.co/Changgil/K2S3-Mistral-7bx2-48layers_v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|arc:challenge|25_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|gsm8k|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hellaswag|10_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-59-03.273138.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T19-59-03.273138.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- '**/details_harness|winogrande|5_2024-03-29T19-59-03.273138.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T19-59-03.273138.parquet'
- config_name: results
data_files:
- split: 2024_03_29T19_59_03.273138
path:
- results_2024-03-29T19-59-03.273138.parquet
- split: latest
path:
- results_2024-03-29T19-59-03.273138.parquet
---
# Dataset Card for Evaluation run of Changgil/K2S3-Mistral-7bx2-48layers_v1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Changgil/K2S3-Mistral-7bx2-48layers_v1.2](https://huggingface.co/Changgil/K2S3-Mistral-7bx2-48layers_v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Changgil__K2S3-Mistral-7bx2-48layers_v1.2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-29T19:59:03.273138](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-Mistral-7bx2-48layers_v1.2/blob/main/results_2024-03-29T19-59-03.273138.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.579467606046818,
"acc_stderr": 0.033372240125975824,
"acc_norm": 0.5822000586410329,
"acc_norm_stderr": 0.034046612878066436,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.4612111840396961,
"mc2_stderr": 0.015189287391161425
},
"harness|arc:challenge|25": {
"acc": 0.5179180887372014,
"acc_stderr": 0.014602005585490978,
"acc_norm": 0.5631399317406144,
"acc_norm_stderr": 0.014494421584256524
},
"harness|hellaswag|10": {
"acc": 0.5812587134037045,
"acc_stderr": 0.0049234456278615165,
"acc_norm": 0.7783310097590121,
"acc_norm_stderr": 0.004145206350032319
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278006,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278006
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.0247843169421564,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.0247843169421564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988836,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.02546977014940017,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.02546977014940017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32513966480446926,
"acc_stderr": 0.015666542785053562,
"acc_norm": 0.32513966480446926,
"acc_norm_stderr": 0.015666542785053562
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02736359328468497,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02736359328468497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.409387222946545,
"acc_stderr": 0.012558780895570753,
"acc_norm": 0.409387222946545,
"acc_norm_stderr": 0.012558780895570753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159645,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159645
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02003639376835263,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02003639376835263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540606,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.4612111840396961,
"mc2_stderr": 0.015189287391161425
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972385
},
"harness|gsm8k|5": {
"acc": 0.4981046247156937,
"acc_stderr": 0.013772385765569753
}
}
```
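For a quick sanity check, the headline metrics above can be averaged locally. Note that this simple mean is only an illustration — the leaderboard's displayed average is computed from the aggregated "results" configuration and also folds in the per-subject MMLU scores, so it will differ from this figure:

```python
from statistics import mean

# Headline metrics copied from the "latest results" JSON above (illustration only).
headline = {
    "arc_challenge_acc_norm": 0.5631399317406144,
    "hellaswag_acc_norm": 0.7783310097590121,
    "truthfulqa_mc2": 0.4612111840396961,
    "winogrande_acc": 0.7419100236779794,
    "gsm8k_acc": 0.4981046247156937,
}

average = mean(headline.values())
print(f"Simple mean of headline metrics: {average:.4f}")  # ~0.6085
```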
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
laaaarrywang/testing | ---
license: apache-2.0
---
|
CyberHarem/aponia_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Aponia (Houkai 3rd)
This is the dataset of Aponia (Houkai 3rd), containing 320 images and their tags.
The core tags of this character are `long_hair, breasts, bangs, hair_between_eyes, large_breasts, brown_hair, blue_eyes, mole_under_eye, mole, long_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 320 | 613.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aponia_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 320 | 284.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aponia_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 795 | 608.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aponia_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 320 | 509.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aponia_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 795 | 960.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aponia_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aponia_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, closed_mouth, looking_at_viewer, nun, solo, upper_body, veil, black_dress, simple_background, white_background, long_sleeves |
| 1 | 13 |  |  |  |  |  | 1girl, black_dress, long_sleeves, nun, solo, veil, butterfly_wings, closed_mouth, looking_at_viewer, blonde_hair, breast_curtains, pelvic_curtain, red_eyes, blue_butterfly, full_body, thigh_strap, thighs |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, closed_mouth, looking_at_viewer, solo, sun_hat, white_dress, white_headwear, simple_background, choker, white_background, blonde_hair, criss-cross_halter, flower, purple_eyes, smile, upper_body |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, smile, solo, sun_hat, white_dress, white_headwear, blue_butterfly, looking_at_viewer, choker, closed_mouth |
| 4 | 7 |  |  |  |  |  | 1girl, black_dress, hetero, solo_focus, veil, 1boy, blush, long_sleeves, looking_at_viewer, nun, penis, mosaic_censoring, nipples, purple_eyes, pussy, sex, vaginal, smile, thigh_strap |
| 5 | 6 |  |  |  |  |  | bare_shoulders, black_gloves, closed_mouth, hair_ornament, holding_clipboard, 1girl, cleavage_cutout, looking_at_viewer, smile, solo, white_dress, white_thighhighs, black_dress, elbow_gloves, black_headwear, detached_sleeves, hat, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | looking_at_viewer | nun | solo | upper_body | veil | black_dress | simple_background | white_background | long_sleeves | butterfly_wings | blonde_hair | breast_curtains | pelvic_curtain | red_eyes | blue_butterfly | full_body | thigh_strap | thighs | bare_shoulders | cleavage | sun_hat | white_dress | white_headwear | choker | criss-cross_halter | flower | purple_eyes | smile | hetero | solo_focus | 1boy | blush | penis | mosaic_censoring | nipples | pussy | sex | vaginal | black_gloves | hair_ornament | holding_clipboard | cleavage_cutout | white_thighhighs | elbow_gloves | black_headwear | detached_sleeves | hat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:------|:-------|:-------------|:-------|:--------------|:--------------------|:-------------------|:---------------|:------------------|:--------------|:------------------|:-----------------|:-----------|:-----------------|:------------|:--------------|:---------|:-----------------|:-----------|:----------|:--------------|:-----------------|:---------|:---------------------|:---------|:--------------|:--------|:---------|:-------------|:-------|:--------|:--------|:-------------------|:----------|:--------|:------|:----------|:---------------|:----------------|:--------------------|:------------------|:-------------------|:---------------|:-----------------|:-------------------|:------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | X | | | X | X | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | X | | | | X | X | X | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | | | X | X | | | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | X | | | X | X | X | | | | | | | | | | | X | | | X | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
msr_text_compression | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- other
license_details: Microsoft Research Data License Agreement
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-Open-American-National-Corpus-(OANC1)
task_categories:
- summarization
task_ids: []
pretty_name: MsrTextCompression
dataset_info:
features:
- name: source_id
dtype: string
- name: domain
dtype: string
- name: source_text
dtype: string
- name: targets
sequence:
- name: compressed_text
dtype: string
- name: judge_id
dtype: string
- name: num_ratings
dtype: int64
- name: ratings
sequence: int64
splits:
- name: train
num_bytes: 5001312
num_examples: 4936
- name: validation
num_bytes: 449691
num_examples: 447
- name: test
num_bytes: 804536
num_examples: 785
download_size: 0
dataset_size: 6255539
---
# Dataset Card for MsrTextCompression
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://msropendata.com/datasets/f8ce2ec9-7fbd-48f7-a8bb-2d2279373563
- **Repository:**
- **Paper:** https://www.microsoft.com/en-us/research/wp-content/uploads/2016/09/Sentence_Compression_final-1.pdf
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains sentences and short paragraphs with corresponding shorter (compressed) versions. There are up to five compressions for each input text, together with quality judgements of their meaning preservation and grammaticality. The dataset is derived using source texts from the Open American National Corpus (www.anc.org) and crowd-sourcing.
### Supported Tasks and Leaderboards
Text Summarization
### Languages
English
## Dataset Structure
### Data Instances
It contains approximately 6,000 source texts with multiple compressions (about 26,000 pairs of source and compressed texts), representing business letters, newswire, journals, and technical documents sampled from the Open American National Corpus (OANC1).
- Each source text is accompanied by up to five crowd-sourced rewrites constrained to a preset compression ratio and annotated with quality judgments. Multiple rewrites permit study of the impact of operations on human compression quality and facilitate automatic evaluation.
- This dataset is the first to provide compressions at the multi-sentence (two-sentence paragraph) level, which may present a stepping stone to whole-document summarization.
- Many of these two-sentence paragraphs are compressed both as paragraphs and separately sentence-by-sentence, offering data that may yield insights into the impact of multi-sentence operations on human compression quality.
| Description | Source | Target | Average CPS | Meaning Quality | Grammar Quality |
| :------------- | :----------: | -----------: | -----------: | -----------: | -----------: |
| 1-Sentence | 3764 | 15523 | 4.12 | 2.78 | 2.81 |
| 2-Sentence | 2405 | 10900 | 4.53 | 2.78 | 2.83 |
**Note**: Average CPS = Average Compressions per Source Text
### Data Fields
```
{'domain': 'Newswire',
'source_id': '106',
'source_text': '" Except for this small vocal minority, we have just not gotten a lot of groundswell against this from members, " says APA president Philip G. Zimbardo of Stanford University.',
'targets': {'compressed_text': ['"Except for this small vocal minority, we have not gotten a lot of groundswell against this," says APA president Zimbardo.',
'"Except for a vocal minority, we haven\'t gotten much groundswell from members, " says Philip G. Zimbardo of Stanford University.',
'APA president of Stanford has stated that except for a vocal minority they have not gotten a lot of pushback from members.',
'APA president Philip G. Zimbardo of Stanford says they have not had much opposition against this.'],
'judge_id': ['2', '22', '10', '0'],
'num_ratings': [3, 3, 3, 3],
'ratings': [[6, 6, 6], [11, 6, 6], [6, 11, 6], [6, 11, 11]]}}
```
- source_id: index of article per original dataset
- source_text: uncompressed original text
- domain: source of the article
- targets:
- compressed_text: compressed version of `source_text`
- judge_id: anonymized ids of crowdworkers who proposed compression
- num_ratings: number of ratings available for each proposed compression
- ratings: see table below
Ratings system (excerpted from authors' README):
- 6 = Most important meaning Flawless language (3 on meaning and 3 on grammar as per the paper's terminology)
- 7 = Most important meaning Minor errors (3 on meaning and 2 on grammar)
- 9 = Most important meaning Disfluent or incomprehensible (3 on meaning and 1 on grammar)
- 11 = Much meaning Flawless language (2 on meaning and 3 on grammar)
- 12 = Much meaning Minor errors (2 on meaning and 2 on grammar)
- 14 = Much meaning Disfluent or incomprehensible (2 on meaning and 1 on grammar)
- 21 = Little or none meaning Flawless language (1 on meaning and 3 on grammar)
- 22 = Little or none meaning Minor errors (1 on meaning and 2 on grammar)
- 24 = Little or none meaning Disfluent or incomprehensible (1 on meaning and 1 on grammar)
See **README.txt** from data archive for additional details.
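The two-digit codes in the `ratings` field combine the meaning and grammar scores described above. A small helper (a sketch, not part of the official release) can decode them back into the paper's (meaning, grammar) terminology:

```python
# Decode the two-digit rating codes from the `ratings` field into
# (meaning, grammar) score pairs. The mapping transcribes the authors'
# README excerpt above; codes not listed there raise a KeyError.
RATING_TO_SCORES = {
    6: (3, 3), 7: (3, 2), 9: (3, 1),
    11: (2, 3), 12: (2, 2), 14: (2, 1),
    21: (1, 3), 22: (1, 2), 24: (1, 1),
}

def decode_ratings(ratings):
    """Map a list of raw rating codes to (meaning, grammar) tuples."""
    return [RATING_TO_SCORES[r] for r in ratings]

# Ratings of one compressed text from the data instance shown earlier.
print(decode_ratings([6, 11, 11]))  # [(3, 3), (2, 3), (2, 3)]
```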
### Data Splits
There are 4,936 source texts in the training set, 448 in the development set, and 785 in the test set.
## Dataset Creation
### Annotations
#### Annotation process
Compressions were created using UHRS, an in-house crowd-sourcing system similar to Amazon's Mechanical Turk, in two annotation rounds, one for shortening and a second to rate compression quality:
1. In the first round, five workers were tasked with abridging each source text by at least 25%, while remaining grammatical and fluent, and retaining the meaning of the original.
2. In the second round, 3-5 judges (raters) were asked to evaluate the grammaticality of each compression on a scale from 1 (major errors, disfluent) through 3 (fluent), and again analogously for meaning preservation on a scale from 1 (orthogonal) through 3 (most important meaning-preserving).
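The 25% shortening constraint from the first round can be sanity-checked per source/compression pair. The sketch below measures length in whitespace tokens, which is one reasonable proxy; the paper does not specify the exact unit used by the task UI:

```python
def compression_ratio(source, compressed):
    """Fraction of the source length removed, measured in whitespace tokens."""
    src_len = len(source.split())
    return 1.0 - len(compressed.split()) / src_len

# Toy pair: 9 tokens shortened to 6 tokens removes one third of the text,
# which satisfies the "at least 25%" requirement.
src = "the quick brown fox jumps over the lazy dog"
cmp = "the fox jumps over the dog"
ratio = compression_ratio(src, cmp)
print(ratio >= 0.25)  # True
```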
## Additional Information
### Licensing Information
Microsoft Research Data License Agreement
### Citation Information
```
@inproceedings{Toutanova2016ADA,
  title={A Dataset and Evaluation Metrics for Abstractive Compression of Sentences and Short Paragraphs},
  author={Kristina Toutanova and Chris Brockett and Ke M. Tran and Saleema Amershi},
  booktitle={EMNLP},
  year={2016}
}
```
### Contributions
Thanks to [@jeromeku](https://github.com/jeromeku) for adding this dataset. |
LukeEuser/docvqa_75_unanswerable_questions | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: query
struct:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: float32
length: 4
- name: answer
struct:
- name: match_score
dtype: float64
- name: matched_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 33128041.0
num_examples: 100
- name: test
num_bytes: 6102508.0
num_examples: 20
download_size: 13283036
dataset_size: 39230549.0
---
# Dataset Card for "docvqa_75_unanswerable_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__guess-vi-d44dbe-2087167154 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-560m
metrics: []
dataset_name: futin/guess
dataset_config: vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-560m
* Dataset: futin/guess
* Config: vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
DanielSongShen/CLIP-food101-image-dataset-large-no-image_latents_hidden_states | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheesecake
'17': cheese_plate
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
splits:
- name: train
num_bytes: 3842657187.0
num_examples: 75750
download_size: 3798519475
dataset_size: 3842657187.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mskov/ESC50 | ---
license: cc
dataset_info:
features:
- name: filename
dtype: string
- name: fold
dtype: int64
- name: target
dtype: int64
- name: category
dtype: string
- name: esc10
dtype: bool
- name: src_file
dtype: int64
- name: take
dtype: string
- name: audio
dtype: audio
struct:
- name: bytes
dtype: binary
- name: path
dtype: string
splits:
- name: train
num_bytes: 441114557
num_examples: 1000
- name: test
num_bytes: 441115029
num_examples: 1000
download_size: 773323386
dataset_size: 882229586
---
|
AkikoOu/hqz-beijingoperaface | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 68608568.0
num_examples: 199
download_size: 68599282
dataset_size: 68608568.0
---
# Dataset Card for "hqz-beijingoperaface"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MJannik/240221-logo-caption-02 | ---
license: unknown
---
|
embedding-data/PAQ_pairs | ---
license: mit
language:
- en
paperswithcode_id: embedding-data/PAQ_pairs
pretty_name: PAQ_pairs
task_categories:
- sentence-similarity
- paraphrase-mining
task_ids:
- semantic-similarity-classification
---
# Dataset Card for "PAQ_pairs"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/facebookresearch/PAQ](https://github.com/facebookresearch/PAQ)
- **Repository:** [More Information Needed](https://github.com/facebookresearch/PAQ)
- **Paper:** [More Information Needed](https://github.com/facebookresearch/PAQ)
- **Point of Contact:** [More Information Needed](https://github.com/facebookresearch/PAQ)
- **Size of downloaded dataset files:**
- **Size of the generated dataset:**
- **Total amount of disk used:** 21 Bytes
### Dataset Summary
Pairs of questions and answers obtained from Wikipedia.
Disclaimer: The team releasing PAQ QA pairs did not upload the dataset to the Hub and did not write a dataset card.
These steps were done by the Hugging Face team.
### Supported Tasks
- [Sentence Transformers](https://huggingface.co/sentence-transformers) training; useful for semantic search and sentence similarity.
### Languages
- English.
## Dataset Structure
Each example in the dataset contains a pair of sentences and is formatted as a dictionary with the key `"set"` mapping to a list of the two sentences. The first sentence is a question and the second an answer, so the two sentences are semantically similar.
```
{"set": [sentence_1, sentence_2]}
{"set": [sentence_1, sentence_2]}
...
{"set": [sentence_1, sentence_2]}
```
This dataset is useful for training Sentence Transformers models. Refer to the following post on how to train models using similar pairs of sentences.
### Usage Example
Install the 🤗 Datasets library with `pip install datasets` and load the dataset from the Hub with:
```python
from datasets import load_dataset
dataset = load_dataset("embedding-data/PAQ_pairs")
```
The dataset is loaded as a `DatasetDict` and has the format:
```python
DatasetDict({
train: Dataset({
features: ['set'],
num_rows: 64371441
})
})
```
Review an example `i` with:
```python
dataset["train"][i]["set"]
```
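Since each record stores its pair under the `"set"` key, splitting records into (question, answer) tuples takes only a line or two. This is a plain-Python sketch that assumes the record layout shown above; the sample records are invented for illustration:

```python
def to_pairs(records):
    """Unpack {"set": [question, answer]} records into (question, answer) tuples."""
    return [(r["set"][0], r["set"][1]) for r in records]

# Two toy records in the dataset's format.
records = [
    {"set": ["when was the eiffel tower built",
             "The Eiffel Tower was built in 1889."]},
    {"set": ["who wrote hamlet",
             "Hamlet was written by William Shakespeare."]},
]
questions, answers = zip(*to_pairs(records))
print(questions[0])  # when was the eiffel tower built
```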
### Data Instances
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Data Fields
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Data Splits
[More Information Needed](https://github.com/facebookresearch/PAQ)
## Dataset Creation
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Curation Rationale
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/facebookresearch/PAQ)
#### Who are the source language producers?
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/facebookresearch/PAQ)
#### Who are the annotators?
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Personal and Sensitive Information
[More Information Needed](https://github.com/facebookresearch/PAQ)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Discussion of Biases
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Other Known Limitations
[More Information Needed](https://github.com/facebookresearch/PAQ)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/facebookresearch/PAQ)
### Licensing Information
The PAQ QA-pairs and metadata is licensed under [CC-BY-SA](https://creativecommons.org/licenses/by-sa/3.0/).
Other data is licensed according to the accompanying license files.
### Citation Information
```
@article{lewis2021paq,
title={PAQ: 65 Million Probably-Asked Questions and What You Can Do With Them},
author={Patrick Lewis and Yuxiang Wu and Linqing Liu and Pasquale Minervini and Heinrich Küttler and Aleksandra Piktus and Pontus Stenetorp and Sebastian Riedel},
year={2021},
eprint={2102.07033},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@patrick-s-h-lewis](https://github.com/patrick-s-h-lewis) for adding this dataset.
|
qywu/ruozhiba_en | ---
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: followup_question
dtype: string
- name: model
dtype: string
splits:
- name: train_sft
num_bytes: 954797
num_examples: 238
download_size: 548182
dataset_size: 954797
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
size_categories:
- n<1K
---
# Ruozhiba English Data
Based on the findings from [COIG-CQIA](https://arxiv.org/html/2403.18058v1), Ruozhiba data is a high-quality instruction tuning dataset that can greatly improve supervised fine-tuning models' performance.
We translated the 240 instructions in Ruozhiba from Chinese to English.
We filtered out or modified some instructions that are language- or culture-specific.
Some Chinese instructions are kept to preserve their original meaning.
Finally, we re-generated the responses using `gpt-4-turbo` and added one additional turn to improve robustness.
## MT-Bench
We use GPT-4-0125-preview as the judge. On MT-Bench, [ruozhiba_en](https://huggingface.co/datasets/qywu/ruozhiba_en) achieves performance comparable to the [ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) dataset.
| Model | Total | Coding | Extraction | Humanities | Math | Reasoning | Roleplay | STEM | Writing |
|--------------------------------------------|-------|--------|------------|------------|------|-----------|----------|------|---------|
| alignment-handbook/zephyr-7b-sft-full | 5.6 | 3.95 | 6.75 | 7.5 | 3.1 | 4.05 | 6.15 | 6.1 | 7.2 |
| zephyr-7b-sft-ruozhiba | 5.88 | 3.75 | 6.45 | 8.11 | 2.7 | 4.2 | 7.4 | 7.4 | 7.15 | |
adalib/pandas-cond-gen-sub-10 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: api
dtype: string
splits:
- name: train
num_bytes: 6363487
num_examples: 463
download_size: 2178087
dataset_size: 6363487
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ajsbsd/nvidia-qa | ---
license: bsd
---
Nvidia Documentation Question and Answer pairs
A Q&A dataset for LLM fine-tuning about NVIDIA SDKs and blogs.
This dataset was obtained by generating Q&A pairs from a few NVIDIA websites, such as development kits and guides. It can be used to fine-tune any LLM to instill knowledge about NVIDIA.
Source: https://www.kaggle.com/datasets/gondimalladeepesh/nvidia-documentation-question-and-answer-pairs |
evanfrick/lichess | ---
license: mit
---
|
Vinnyyw/Anahivoices | ---
license: openrail
---
|
majedk01/english-arabic-text | ---
dataset_info:
features:
- name: translation
struct:
- name: ar
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 4585416789
num_examples: 11349997
download_size: 2149868789
dataset_size: 4585416789
---
# Dataset Card for "english-arabic-text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ym0v0my/Time_series_dataset | ---
task_categories:
- time-series-forecasting
---
Includes six common time-series forecasting datasets:
* ETTsmall
- ETTh1
- ETTh2
- ETTm1
- ETTm2
* traffic
* electricity
* illness
* exchange_rate |
falba/tokenized_news_2 | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 860380944
num_examples: 44972
- name: validation
num_bytes: 105896194
num_examples: 5622
- name: test
num_bytes: 107677911
num_examples: 5622
download_size: 536056058
dataset_size: 1073955049
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
TankuVie/ted_talks_vi_it_parallel_corpus | ---
license: other
---
|
Francesco/currency-v4f8j | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': currency
'1': Dime
'2': Nickel
'3': Penny
'4': Quarter
'5': fifty
'6': five
'7': hundred
'8': one
'9': ten
'10': twenty
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: currency-v4f8j
tags:
- rf100
---
# Dataset Card for currency-v4f8j
** The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/currency-v4f8j
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
currency-v4f8j
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
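Because `bbox` uses the COCO convention of `[x_min, y_min, width, height]`, converting a box to corner coordinates (e.g. for drawing with libraries that expect `[x_min, y_min, x_max, y_max]`) is a one-liner. A sketch; `coco_to_corners` is not part of the dataset loader:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format [x_min, y_min, width, height] box
    to [x_min, y_min, x_max, y_max] corner coordinates."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance shown above.
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```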
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/currency-v4f8j
### Citation Information
```
@misc{ currency-v4f8j,
title = { currency v4f8j Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/currency-v4f8j } },
url = { https://universe.roboflow.com/object-detection/currency-v4f8j },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
C-MTEB/DuRetrieval | ---
configs:
- config_name: default
data_files:
- split: corpus
path: data/corpus-*
- split: queries
path: data/queries-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 91213303
num_examples: 100001
- name: queries
num_bytes: 131354
num_examples: 2000
download_size: 64531170
dataset_size: 91344657
---
# Dataset Card for "DuRetrieval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_double_superlative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 101921
num_examples: 242
- name: train
num_bytes: 84435
num_examples: 200
download_size: 127840
dataset_size: 186356
---
# Dataset Card for "MULTI_VALUE_rte_double_superlative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hoshino_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hoshino/小鳥遊ホシノ/星野 (Blue Archive)
This is the dataset of hoshino/小鳥遊ホシノ/星野 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `pink_hair, ahoge, halo, long_hair, hair_between_eyes, blue_eyes, yellow_eyes, pink_halo, heterochromia, very_long_hair, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 960.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshino_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 779.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshino_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1366 | 1.62 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshino_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hoshino_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, black_skirt, blue_necktie, collared_shirt, looking_at_viewer, open_mouth, plaid_skirt, pleated_skirt, white_shirt, black_gloves, fingerless_gloves, solo, blush, simple_background, smile, white_background, one_eye_closed, id_card, chest_harness, puffy_long_sleeves |
| 1 | 5 |  |  |  |  |  | 1girl, black_skirt, chest_harness, collared_shirt, holding_gun, long_sleeves, plaid_skirt, pleated_skirt, solo, white_shirt, black_gloves, blue_necktie, fingerless_gloves, id_card, closed_mouth, holding_shield, looking_at_viewer, parted_lips, shotgun_shell |
| 2 | 9 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, blush, school_swimsuit, short_hair, solo, collarbone, covered_navel, simple_background, looking_at_viewer, small_breasts, white_background, bare_arms, open_mouth, bare_shoulders, blue_one-piece_swimsuit, parted_lips, standing |
| 3 | 20 |  |  |  |  |  | 1girl, eyewear_on_head, frilled_bikini, looking_at_viewer, low_twintails, official_alternate_costume, open_mouth, solo, sunglasses, white_bikini, blue_jacket, blush, collarbone, hair_bobbles, navel, open_jacket, long_sleeves, off_shoulder, small_breasts, outdoors, stomach, day, ocean, bare_shoulders, blue-tinted_eyewear, beach, blue_sky, cloud, inflatable_toy, :d, white-framed_eyewear |
| 4 | 9 |  |  |  |  |  | 1girl, solo, bare_shoulders, blush, collarbone, looking_at_viewer, open_mouth, white_dress, off-shoulder_dress, smile, official_alternate_costume, shoulder_bag, frilled_dress |
| 5 | 5 |  |  |  |  |  | black_dress, blush, enmaided, frilled_apron, maid_apron, open_mouth, smile, white_apron, 1girl, black_footwear, looking_at_viewer, maid_headdress, one_eye_closed, puffy_sleeves, shoes, short_sleeves, white_pantyhose, 2girls, closed_eyes, frilled_dress, garter_straps, indoors, long_sleeves, skin_fang, solo_focus, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | blue_necktie | collared_shirt | looking_at_viewer | open_mouth | plaid_skirt | pleated_skirt | white_shirt | black_gloves | fingerless_gloves | solo | blush | simple_background | smile | white_background | one_eye_closed | id_card | chest_harness | puffy_long_sleeves | holding_gun | long_sleeves | closed_mouth | holding_shield | parted_lips | shotgun_shell | black_one-piece_swimsuit | school_swimsuit | short_hair | collarbone | covered_navel | small_breasts | bare_arms | bare_shoulders | blue_one-piece_swimsuit | standing | eyewear_on_head | frilled_bikini | low_twintails | official_alternate_costume | sunglasses | white_bikini | blue_jacket | hair_bobbles | navel | open_jacket | off_shoulder | outdoors | stomach | day | ocean | blue-tinted_eyewear | beach | blue_sky | cloud | inflatable_toy | :d | white-framed_eyewear | white_dress | off-shoulder_dress | shoulder_bag | frilled_dress | black_dress | enmaided | frilled_apron | maid_apron | white_apron | black_footwear | maid_headdress | puffy_sleeves | shoes | short_sleeves | white_pantyhose | 2girls | closed_eyes | garter_straps | indoors | skin_fang | solo_focus | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:---------------|:-----------------|:--------------------|:-------------|:--------------|:----------------|:--------------|:---------------|:--------------------|:-------|:--------|:--------------------|:--------|:-------------------|:-----------------|:----------|:----------------|:---------------------|:--------------|:---------------|:---------------|:-----------------|:--------------|:----------------|:---------------------------|:------------------|:-------------|:-------------|:----------------|:----------------|:------------|:-----------------|:--------------------------|:-----------|:------------------|:-----------------|:----------------|:-----------------------------|:-------------|:---------------|:--------------|:---------------|:--------|:--------------|:---------------|:-----------|:----------|:------|:--------|:----------------------|:--------|:-----------|:--------|:-----------------|:-----|:-----------------------|:--------------|:---------------------|:---------------|:----------------|:--------------|:-----------|:----------------|:-------------|:--------------|:-----------------|:-----------------|:----------------|:--------|:----------------|:------------------|:---------|:--------------|:----------------|:----------|:------------|:-------------|:-------------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | | | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | | | X | X | | | | | | X | X | X | | X | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 20 |  |  |  |  |  | X | | | | X | X | | | | | | X | X | | | | | | | | | X | | | | | | | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | X | X | | | | | | X | X | | X | | | | | | | | | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | X | | | | | | | X | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
hippocrates/OphthoFreeResponse_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 519751
num_examples: 820
- name: valid
num_bytes: 519751
num_examples: 820
- name: test
num_bytes: 64223
num_examples: 103
download_size: 324203
dataset_size: 1103725
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
hadninede/oasst1_id | ---
license: apache-2.0
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 79183032
num_examples: 81291
- name: validation
num_bytes: 2952264
num_examples: 3001
download_size: 25749085
dataset_size: 82135296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
task_categories:
- text-generation
language:
- id
pretty_name: Translated OASST1 dataset in Bahasa Indonesia using HelsinkiNLP OPUS models
size_categories:
- 10K<n<100K
---
This is the Indonesian version of the OASST1 dataset, translated entirely using Helsinki-NLP OPUS models and the [llama2lang library](https://github.com/UnderstandLingBV/LLaMa2lang).
Feel free to request another dataset translation into Bahasa Indonesia; I'll try to help.
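Since each row carries `message_id` and `parent_id`, conversation trees can be rebuilt client-side. A minimal sketch (the rows below are hypothetical placeholders, not actual dataset entries):

```python
from collections import defaultdict

# Hypothetical placeholder rows; real rows carry many more fields.
rows = [
    {"message_id": "a", "parent_id": None, "role": "prompter"},
    {"message_id": "b", "parent_id": "a", "role": "assistant"},
    {"message_id": "c", "parent_id": "a", "role": "assistant"},
]

# Group messages by their parent to recover the tree structure.
children = defaultdict(list)
for row in rows:
    children[row["parent_id"]].append(row["message_id"])

assert children[None] == ["a"]       # root prompt
assert children["a"] == ["b", "c"]   # two competing assistant replies
```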
_Fellow Indonesians, we shall not be left behind in the age of AI._ |
sushei/combined_dataset | ---
license: unknown
---
|
dexaai/huberman_on_exercise | ---
dataset_info:
features:
- name: id
dtype: string
- name: embedding
sequence: float64
- name: metadata
struct:
- name: chunkTitle
dtype: string
- name: episodeTitle
dtype: string
- name: imgUrl
dtype: string
- name: published
dtype: string
- name: url
dtype: string
- name: document
dtype: string
splits:
- name: data
num_bytes: 4507060
num_examples: 293
download_size: 4292011
dataset_size: 4507060
language:
- en
pretty_name: 'Huberman Lab transcripts: Exercise episodes'
size_categories:
- n<1K
license: mit
---
# Dataset Card for "huberman_on_exercise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rasu23/iapp_all_mep_iter0_new | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 33113417
num_examples: 5761
- name: test
num_bytes: 4181570
num_examples: 739
download_size: 6375834
dataset_size: 37294987
---
# Dataset Card for "iapp_all_mep_iter0_new"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/wine | ---
language:
- en
tags:
- wine
- tabular_classification
- binary_classification
pretty_name: Wine quality
size_categories:
- 1K<n<10K
task_categories:
- tabular-classification
configs:
- wine
license: cc
---
# Wine
The [Wine dataset](https://www.kaggle.com/datasets/ghassenkhaled/wine-quality-data) from Kaggle.
Classify wine as red or white.
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|-----------------------------------------------------------------|
| wine | Binary classification | Is this red wine? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/wine")["train"]
``` |
pgajo/subs-v2 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 5367257273.704724
num_examples: 79191
- name: test
num_bytes: 579733939.4022752
num_examples: 8800
download_size: 5812185768
dataset_size: 5946991213.106999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
sasha/pedro-embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 4762817.0
num_examples: 150
download_size: 4945449
dataset_size: 4762817.0
---
# Dataset Card for "pedro-embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
presencesw/GPT_data_good | ---
dataset_info:
features:
- name: index
dtype: int64
- name: topic
dtype: string
- name: context
dtype: string
- name: Evidence
dtype: string
- name: Claim
dtype: string
- name: Label
dtype: string
- name: eval
dtype: float64
splits:
- name: train
num_bytes: 24649356
num_examples: 10820
download_size: 12537449
dataset_size: 24649356
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
doc2dial | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- closed-domain-qa
paperswithcode_id: doc2dial
pretty_name: doc2dial
dataset_info:
- config_name: dialogue_domain
features:
- name: dial_id
dtype: string
- name: doc_id
dtype: string
- name: domain
dtype: string
- name: turns
list:
- name: turn_id
dtype: int32
- name: role
dtype: string
- name: da
dtype: string
- name: references
list:
- name: sp_id
dtype: string
- name: label
dtype: string
- name: utterance
dtype: string
splits:
- name: train
num_bytes: 6924209
num_examples: 3474
- name: validation
num_bytes: 1315815
num_examples: 661
download_size: 5879543
dataset_size: 8240024
- config_name: document_domain
features:
- name: domain
dtype: string
- name: doc_id
dtype: string
- name: title
dtype: string
- name: doc_text
dtype: string
- name: spans
list:
- name: id_sp
dtype: string
- name: tag
dtype: string
- name: start_sp
dtype: int32
- name: end_sp
dtype: int32
- name: text_sp
dtype: string
- name: title
dtype: string
- name: parent_titles
dtype: string
- name: id_sec
dtype: string
- name: start_sec
dtype: int32
- name: text_sec
dtype: string
- name: end_sec
dtype: int32
- name: doc_html_ts
dtype: string
- name: doc_html_raw
dtype: string
splits:
- name: train
num_bytes: 204874908
num_examples: 3416
download_size: 5879543
dataset_size: 204874908
- config_name: doc2dial_rc
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: domain
dtype: string
splits:
- name: validation
num_bytes: 22705288
num_examples: 3972
- name: train
num_bytes: 114778994
num_examples: 20431
download_size: 5879543
dataset_size: 137484282
---
# Dataset Card for doc2dial
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://doc2dial.github.io
- **Repository:** [Needs More Information]
- **Paper:** https://www.aclweb.org/anthology/2020.emnlp-main.652.pdf
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Doc2dial is a dataset of goal-oriented dialogues that are grounded in associated documents. It includes over 4500 annotated conversations, with an average of 14 turns, grounded in over 450 documents from four domains. Compared to prior document-grounded dialogue datasets, this dataset covers a greater variety of dialogue scenes in information-seeking conversations.
### Supported Tasks and Leaderboards
> Supported Task: [Shared Task](https://doc2dial.github.io/workshop2021/shared.html) hosted by DialDoc21 at ACL.
> Leaderboard: [LINK](https://eval.ai/web/challenges/challenge-page/793)
### Languages
English
## Dataset Structure
### Data Instances
Sample data instance for `dialogue_domain` :
```
{
"dial_id": "9f44c1539efe6f7e79b02eb1b413aa43",
"doc_id": "Top 5 DMV Mistakes and How to Avoid Them#3_0",
"domain": "dmv",
"turns": [
{
"da": "query_condition",
"references": [
{
"sp_id": "4",
"label": "precondition"
}
],
"role": "user",
"turn_id": 1,
"utterance": "Hello, I forgot o update my address, can you help me with that?"
},
{
"da": "response_solution",
"references": [
{
"sp_id": "6",
"label": "solution"
},
{
"sp_id": "7",
"label": "solution"
},
{
"sp_id": "4",
"label": "references"
}
],
"role": "agent",
"turn_id": 2,
"utterance": "hi, you have to report any change of address to DMV within 10 days after moving. You should do this both for the address associated with your license and all the addresses associated with all your vehicles."
},
{
"da": "query_solution",
"references": [
{
"sp_id": "56",
"label": "solution"
},
{
"sp_id": "48",
"label": "references"
}
],
"role": "user",
"turn_id": 3,
"utterance": "Can I do my DMV transactions online?"
},
{
"da": "respond_solution",
"references": [
{
"sp_id": "56",
"label": "solution"
},
{
"sp_id": "48",
"label": "references"
}
],
"role": "agent",
"turn_id": 4,
"utterance": "Yes, you can sign up for MyDMV for all the online transactions needed."
},
{
"da": "query_condition",
"references": [
{
"sp_id": "48",
"label": "precondition"
}
],
"role": "user",
"turn_id": 5,
"utterance": "Thanks, and in case I forget to bring all of the documentation needed to the DMV office, what can I do?"
},
{
"da": "respond_solution",
"references": [
{
"sp_id": "49",
"label": "solution"
},
{
"sp_id": "50",
"label": "solution"
},
{
"sp_id": "52",
"label": "solution"
},
{
"sp_id": "48",
"label": "references"
}
],
"role": "agent",
"turn_id": 6,
"utterance": "This happens often with our customers so that's why our website and MyDMV are so useful for our customers. Just check if you can make your transaction online so you don't have to go to the DMV Office."
},
{
"da": "query_solution",
"references": [
{
"sp_id": "6",
"label": "solution"
},
{
"sp_id": "7",
"label": "solution"
},
{
"sp_id": "4",
"label": "references"
}
],
"role": "user",
"turn_id": 7,
"utterance": "Ok, and can you tell me again where should I report my new address?"
},
{
"da": "respond_solution",
"references": [
{
"sp_id": "6",
"label": "solution"
},
{
"sp_id": "7",
"label": "solution"
},
{
"sp_id": "4",
"label": "references"
}
],
"role": "agent",
"turn_id": 8,
"utterance": "Sure. Any change of address must be reported to the DMV, that's for the address associated with your license and any of your vehicles."
},
{
"da": "query_condition",
"references": [
{
"sp_id": "40",
"label": "precondition"
}
],
"role": "user",
"turn_id": 9,
"utterance": "Can you tell me more about Traffic points and their cost?"
},
{
"da": "respond_solution",
"references": [
{
"sp_id": "41",
"label": "solution"
},
{
"sp_id": "43",
"label": "solution"
},
{
"sp_id": "40",
"label": "references"
}
],
"role": "agent",
"turn_id": 10,
"utterance": "Traffic points is the system used by DMV to track dangerous drivers. The cost of the traffic points is independent of the DRA, so you get a separate charge based on the total points you accumulate."
}
]
}
```
Sample data instance for `document_domain` :
```
{
"doc_id": "Benefits Planner: Retirement | Online Calculator (WEP Version)#1_0",
"domain": "ssa",
"doc_html_raw": "<main class=\"content\" id=\"content\" role=\"main\">\n\n<section>\n\n<div>\n<h2>\nBenefits Planner: Retirement\n</h2>\n</div>\n</section>\n\n\n<section>\n\n<div>\n\n<div>\n\n\n</div>\n\n<article>\n<section>\n\n<h3>Online Calculator (WEP Version)</h3>\n<p>The calculator shown below allows you to estimate your Social Security benefit.\nHowever, for the most accurate estimates, <a>use the Detailed Calculator</a>.</p>\n<p>You need to enter all your past earnings\n, which are shown on your <a>online </a>.</p>\n\n<p>Please Note:</p>\n<ul class=\"browser-default\">\n<li>The Online Calculator is updated periodically<span>*</span> with new benefit increases and other benefit amounts. Therefore, it is likely that your benefit estimates in the future will differ from those calculated today.</li>\n<li>The Online Calculator works on PCs and Macs with Javascript enabled.</li>\n<li>Some browsers may not allow you to print the table below. </li>\n</ul>\n<p></p>\n\n<div>\nThe Online Calculator temporarily stores information on your local computer while your browser is open. To protect your personal information, you should close your browser after you have finished your estimate.\n</div>\n<p></p>\n\n<div>\n<p>Note: If your birthday is on January 1st, we figure your benefit as if your birthday was in the previous year.</p>\n<p>If you qualify for benefits as a Survivor, your <a>full retirement age for survivors benefits</a> may be different.</p></div>\n\n<div>\n</div></section></article></div></section></main>",
"doc_html_ts": "<main><section><div><h2 sent_id=\"1\" text_id=\"1\">Benefits Planner: Retirement</h2></div></section><section><div><article><section><h3 sent_id=\"2\" text_id=\"2\">Online Calculator (WEP Version)</h3><div tag_id=\"1\"><u sent_id=\"3\" tag_id=\"1\"><u sent_id=\"3\" tag_id=\"1\" text_id=\"3\">The calculator shown below allows you to estimate your Social Security benefit .</u></u><u sent_id=\"4\" tag_id=\"1\"><u sent_id=\"4\" tag_id=\"1\" text_id=\"4\">However ,</u><u sent_id=\"4\" tag_id=\"1\" text_id=\"5\">for the most accurate estimates ,</u><u sent_id=\"4\" tag_id=\"1\" text_id=\"6\">use the Detailed Calculator .</u></u></div><div tag_id=\"2\"><u sent_id=\"5\" tag_id=\"2\"><u sent_id=\"5\" tag_id=\"2\" text_id=\"7\">You need to enter all your past earnings , which are shown on your online .</u></u></div><div tag_id=\"3\"><u sent_id=\"6\" tag_id=\"3\"><u sent_id=\"6\" tag_id=\"3\" text_id=\"8\">Please Note:</u></u></div><ul class=\"browser-default\" tag_id=\"3\"><li tag_id=\"3\"><div tag_id=\"3\"><u sent_id=\"9\" tag_id=\"3\"><u sent_id=\"9\" tag_id=\"3\" text_id=\"9\">The Online Calculator is updated periodically * with new benefit increases and other benefit amounts .</u></u><u sent_id=\"10\" tag_id=\"3\"><u sent_id=\"10\" tag_id=\"3\" text_id=\"10\">Therefore ,</u><u sent_id=\"10\" tag_id=\"3\" text_id=\"11\">it is likely that your benefit estimates in the future will differ from those calculated today .</u></u></div></li><li tag_id=\"3\"><u sent_id=\"11\" tag_id=\"3\"><u sent_id=\"11\" tag_id=\"3\" text_id=\"12\">The Online Calculator works on PCs and Macs with Javascript enabled .</u></u></li><li tag_id=\"3\"><u sent_id=\"12\" tag_id=\"3\"><u sent_id=\"12\" tag_id=\"3\" text_id=\"13\">Some browsers may not allow you to print the table below .</u></u></li></ul><div>The Online Calculator temporarily stores information on your local computer while your browser is open. 
To protect your personal information, you should close your browser after you have finished your estimate.</div><div><div tag_id=\"4\"><u sent_id=\"13\" tag_id=\"4\"><u sent_id=\"13\" tag_id=\"4\" text_id=\"14\">Note:</u></u><u sent_id=\"14\" tag_id=\"4\"><u sent_id=\"14\" tag_id=\"4\" text_id=\"15\">If your birthday is on January 1st ,</u><u sent_id=\"14\" tag_id=\"4\" text_id=\"16\">we figure your benefit as if your birthday was in the previous year .</u></u></div><div tag_id=\"5\"><u sent_id=\"15\" tag_id=\"5\"><u sent_id=\"15\" tag_id=\"5\" text_id=\"17\">If you qualify for benefits as a Survivor ,</u><u sent_id=\"15\" tag_id=\"5\" text_id=\"18\">your full retirement age for survivors benefits may be different .</u></u></div></div></section></article></div></section></main>",
"doc_text": "\n\nBenefits Planner: Retirement \n\n\nOnline Calculator (WEP Version) \nThe calculator shown below allows you to estimate your Social Security benefit. However , for the most accurate estimates , use the Detailed Calculator. You need to enter all your past earnings, which are shown on your online. Please Note: The Online Calculator is updated periodically * with new benefit increases and other benefit amounts. Therefore , it is likely that your benefit estimates in the future will differ from those calculated today. The Online Calculator works on PCs and Macs with Javascript enabled. Some browsers may not allow you to print the table below. Note: If your birthday is on January 1st , we figure your benefit as if your birthday was in the previous year. If you qualify for benefits as a Survivor , your full retirement age for survivors benefits may be different. ",
"title": "Benefits Planner: Retirement | Online Calculator (WEP Version)#1",
"spans": [
{
"end_sec": 32,
"end_sp": 32,
"id_sec": "t_0",
"id_sp": "1",
"parent_titles": "[]",
"start_sec": 0,
"start_sp": 0,
"tag": "h2",
"text_sec": "\n\nBenefits Planner: Retirement \n",
"text_sp": "\n\nBenefits Planner: Retirement \n",
"title": "Benefits Planner: Retirement"
},
{
"end_sec": 67,
"end_sp": 67,
"id_sec": "t_1",
"id_sp": "2",
"parent_titles": "[{'id_sp': '1', 'text': 'Benefits Planner: Retirement', 'level': 'h2'}]",
"start_sec": 32,
"start_sp": 32,
"tag": "h3",
"text_sec": "\n\nOnline Calculator (WEP Version) \n",
"text_sp": "\n\nOnline Calculator (WEP Version) \n",
"title": "Online Calculator (WEP Version)"
},
{
"end_sec": 220,
"end_sp": 147,
"id_sec": "1",
"id_sp": "3",
"parent_titles": "[]",
"start_sec": 67,
"start_sp": 67,
"tag": "u",
"text_sec": "The calculator shown below allows you to estimate your Social Security benefit. However , for the most accurate estimates , use the Detailed Calculator. ",
"text_sp": "The calculator shown below allows you to estimate your Social Security benefit. ",
"title": "Online Calculator (WEP Version)"
}
]
}
```
Sample data instance for `doc2dial_rc` :
```
{
"id": "78f72b08b43791a4a70363fe62b8de08_1",
"is_impossible": false,
"question": "Hello, I want to know about the retirement plan.",
"answers": {
"answer_start": [
0
],
"text": [
"\n\nBenefits Planner: Retirement \n\n\nOnline Calculator (WEP Version) \n"
]
},
"context": "\n\nBenefits Planner: Retirement \n\n\nOnline Calculator (WEP Version) \nThe calculator shown below allows you to estimate your Social Security benefit. However , for the most accurate estimates , use the Detailed Calculator. You need to enter all your past earnings, which are shown on your online. Please Note: The Online Calculator is updated periodically * with new benefit increases and other benefit amounts. Therefore , it is likely that your benefit estimates in the future will differ from those calculated today. The Online Calculator works on PCs and Macs with Javascript enabled. Some browsers may not allow you to print the table below. Note: If your birthday is on January 1st , we figure your benefit as if your birthday was in the previous year. If you qualify for benefits as a Survivor , your full retirement age for survivors benefits may be different. ",
"title": "Benefits Planner: Retirement | Online Calculator (WEP Version)#1_0",
"domain": "ssa"
}
```
### Data Fields
For `document_domain`,
- `doc_id`: the ID of a document;
- `title`: the title of the document;
- `domain`: the domain of the document;
- `doc_text`: the text content of the document (without HTML markups);
- `doc_html_ts`: the document content with HTML markups and the annotated spans that are indicated by `text_id` attribute, which corresponds to `id_sp`.
- `doc_html_raw`: the document content with HTML markups and without span annotations.
- `spans`: key-value pairs of all spans in the document, with `id_sp` as key. Each span includes the following,
- `id_sp`: the id of a span as noted by `text_id` in `doc_html_ts`;
- `start_sp`/ `end_sp`: the start/end position of the text span in `doc_text`;
- `text_sp`: the text content of the span.
- `id_sec`: the id of the (sub)section (e.g. `<p>`) or title (`<h2>`) that contains the span.
- `start_sec` / `end_sec`: the start/end position of the (sub)section in `doc_text`.
- `text_sec`: the text of the (sub)section.
- `title`: the title of the (sub)section.
- `parent_titles`: the parent titles of the `title`.
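The offset fields are plain character indices into `doc_text`, so a span's text can be recovered by slicing. A minimal sketch using the offsets from the sample instance above:

```python
# First two sections of the sample document shown above.
doc_text = ("\n\nBenefits Planner: Retirement \n"
            "\n\nOnline Calculator (WEP Version) \n"
            "The calculator shown below allows you to estimate your Social Security benefit. ")

# Span "2" from the sample instance: the h3 title section.
span = {"id_sp": "2", "start_sp": 32, "end_sp": 67}

# Slicing doc_text with the span offsets reproduces text_sp.
text_sp = doc_text[span["start_sp"]:span["end_sp"]]
assert text_sp == "\n\nOnline Calculator (WEP Version) \n"
```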
For `dialogue_domain`:
- `dial_id`: the ID of a dialogue;
- `doc_id`: the ID of the associated document;
- `domain`: domain of the document;
- `turns`: a list of dialogue turns. Each turn includes,
- `turn_id`: the time order of the turn;
- `role`: either "agent" or "user";
- `da`: dialogue act;
- `references`: the grounding span (`id_sp`) in the associated document. If a turn is an irrelevant turn, i.e., `da` ends with "ood", `reference` is empty. **Note** that spans with labels "*precondition*"/"*solution*" are the actual grounding spans. Spans with label "*reference*" are the related titles or contextual reference, which is used for the purpose of describing a dialogue scene better to crowd contributors.
- `utterance`: the human-generated utterance based on the dialogue scene.
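Since spans labeled "references" are only contextual, a turn's actual grounding can be collected by filtering on the label. A minimal sketch, using turn 2 from the sample dialogue above:

```python
# Turn 2 of the sample dialogue shown above (utterance omitted for brevity).
turn = {
    "turn_id": 2,
    "role": "agent",
    "da": "response_solution",
    "references": [
        {"sp_id": "6", "label": "solution"},
        {"sp_id": "7", "label": "solution"},
        {"sp_id": "4", "label": "references"},
    ],
}

# Keep only the actual grounding spans; "references"-labeled spans are contextual.
grounding = [r["sp_id"] for r in turn["references"]
             if r["label"] in ("precondition", "solution")]
assert grounding == ["6", "7"]
```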
For `doc2dial_rc`, this conforms to the [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/) data format. For how to load Doc2Dial data for the reading comprehension task, please refer to [this repository](https://github.com/doc2dial/sharedtask-dialdoc2021).
- `id`: the ID of a QA instance;
- `question`: user query;
- `answers`: the answers that are grounded in the associated document;
- `answer_start`: the start position of the grounding span in the associated document (`context`);
- `text`: the text content of the grounding span;
- `title`: the title of the associated document;
- `domain`: the domain of the associated document;
- `context`: the text content of the associated document (without HTML markups).
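Because `doc2dial_rc` follows the SQuAD format, each answer can be sanity-checked against the context via its character offset. A minimal sketch using values from the sample instance above:

```python
# Opening of the sample context shown above.
heading = ("\n\nBenefits Planner: Retirement \n"
           "\n\nOnline Calculator (WEP Version) \n")
context = heading + ("The calculator shown below allows you to "
                     "estimate your Social Security benefit. ")

# The sample answer grounds the question in the document's opening sections.
answer = {"answer_start": 0, "text": heading}

# answer_start is a character index into context, so the answer text
# must match the corresponding slice exactly.
start = answer["answer_start"]
assert context[start:start + len(answer["text"])] == answer["text"]
```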
### Data Splits
The dialogue domain provides training and validation splits; the document domain provides a training split only.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Song Feng, Hui Wan, Chulaka Gunasekara, Siva Sankalp Patel, Sachindra Joshi, Luis A. Lastras
### Licensing Information
Creative Commons Attribution 3.0 Unported
### Citation Information
```
@inproceedings{feng-etal-2020-doc2dial,
    title = "doc2dial: A Goal-Oriented Document-Grounded Dialogue Dataset",
    author = "Feng, Song and Wan, Hui and Gunasekara, Chulaka and Patel, Siva and Joshi, Sachindra and Lastras, Luis",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    month = nov,
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.emnlp-main.652",
}
```
### Contributions
Thanks to [@songfeng](https://github.com/songfeng), [@KMFODA](https://github.com/KMFODA) for adding this dataset. |
thercyl/META | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 86462290
num_examples: 2473
download_size: 48256052
dataset_size: 86462290
---
# Dataset Card for "META"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.m1 | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_Cyber_3.m1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_AI_Cyber_3.m1](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.m1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.m1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T19:36:53.650075](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.m1/blob/main/results_2024-03-30T19-36-53.650075.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6509728530945673,\n\
\ \"acc_stderr\": 0.03208216820688878,\n \"acc_norm\": 0.6502876251907654,\n\
\ \"acc_norm_stderr\": 0.032754230179877435,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.77668890936802,\n\
\ \"mc2_stderr\": 0.013728436015980412\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274774,\n\
\ \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927106\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7154949213304123,\n\
\ \"acc_stderr\": 0.004502563079349392,\n \"acc_norm\": 0.8895638319059949,\n\
\ \"acc_norm_stderr\": 0.0031279207383941086\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464085,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464085\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.01275285834653313,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.01275285834653313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.77668890936802,\n\
\ \"mc2_stderr\": 0.013728436015980412\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.012696930106562912\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.m1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|arc:challenge|25_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|arc:challenge|25_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|gsm8k|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|gsm8k|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hellaswag|10_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hellaswag|10_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-30-17.121697.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-36-53.650075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T19-36-53.650075.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- '**/details_harness|winogrande|5_2024-03-30T19-30-17.121697.parquet'
- split: 2024_03_30T19_36_53.650075
path:
- '**/details_harness|winogrande|5_2024-03-30T19-36-53.650075.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T19-36-53.650075.parquet'
- config_name: results
data_files:
- split: 2024_03_30T19_30_17.121697
path:
- results_2024-03-30T19-30-17.121697.parquet
- split: 2024_03_30T19_36_53.650075
path:
- results_2024-03-30T19-36-53.650075.parquet
- split: latest
path:
- results_2024-03-30T19-36-53.650075.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_Cyber_3.m1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_Cyber_3.m1](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.m1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.m1",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-30T19:36:53.650075](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.m1/blob/main/results_2024-03-30T19-36-53.650075.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each one can be found in its "results" and "latest" splits):
```json
{
"all": {
"acc": 0.6509728530945673,
"acc_stderr": 0.03208216820688878,
"acc_norm": 0.6502876251907654,
"acc_norm_stderr": 0.032754230179877435,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.77668890936802,
"mc2_stderr": 0.013728436015980412
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274774,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927106
},
"harness|hellaswag|10": {
"acc": 0.7154949213304123,
"acc_stderr": 0.004502563079349392,
"acc_norm": 0.8895638319059949,
"acc_norm_stderr": 0.0031279207383941086
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464085,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464085
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.01275285834653313,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.01275285834653313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.77668890936802,
"mc2_stderr": 0.013728436015980412
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.012696930106562912
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-65000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1097466
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fuyu-quant/ibl-regression-ver4-linear | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: index
dtype: int64
- name: category
dtype: string
splits:
- name: train
num_bytes: 28379434
num_examples: 10000
- name: test
num_bytes: 2835063
num_examples: 1000
download_size: 18546957
dataset_size: 31214497
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/jaye_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jaye_arknights
This is the dataset of jaye_arknights, containing 59 images and their tags.
The images were crawled from multiple sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 59 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 146 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 59 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 59 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 59 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 59 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 59 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 146 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 146 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 146 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
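The archives in the table above are plain files in this dataset repository, so they can also be fetched directly over HTTP. As a minimal sketch, the snippet below builds the standard `resolve` download URL for one archive (the `main` revision is an assumption; adjust it if the repository uses a different branch):

```python
# Build the direct-download URL for a file hosted in a Hugging Face
# dataset repository, following the /resolve/<revision>/<filename> pattern.
repo_id = "CyberHarem/jaye_arknights"
filename = "dataset-raw.zip"
revision = "main"  # assumed default branch

url = f"https://huggingface.co/datasets/{repo_id}/resolve/{revision}/{filename}"
print(url)
```

The resulting URL can be passed to any HTTP client (e.g. `curl` or `requests`) to download the archive.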
|
lixik/ala | ---
license: openrail
---
|
AiresPucrs/adult-census-income | ---
dataset_info:
features:
- name: age
dtype: int64
- name: workclass
dtype: string
- name: fnlwgt
dtype: int64
- name: education
dtype: string
- name: education.num
dtype: int64
- name: marital.status
dtype: string
- name: occupation
dtype: string
- name: relationship
dtype: string
- name: race
dtype: string
- name: sex
dtype: string
- name: capital.gain
dtype: int64
- name: capital.loss
dtype: int64
- name: hours.per.week
dtype: int64
- name: native.country
dtype: string
- name: income
dtype: string
splits:
- name: train
num_bytes: 5316802
num_examples: 32561
download_size: 553790
dataset_size: 5316802
license: cc
language:
- en
pretty_name: adult-census-income
size_categories:
- 10K<n<100K
---
# adult-census-income
## Overview
The Adult Census Income dataset is used for prediction tasks that determine whether a person makes over $50K a year.
It can also be used to explore biases in ML algorithms.
## Dataset Details
The original dataset, the [Adult Census Income](https://www.kaggle.com/datasets/uciml/adult-census-income), was created by Barry Becker from the 1994 Census database (USA)
to explore biases in ML algorithms. The prediction task for this dataset is to determine whether a person makes over $50K a year.
This data was extracted from the 1994 [Census Bureau database](https://www.census.gov/en.html) by Ronny Kohavi and Barry Becker (Data Mining and Visualization, Silicon Graphics). A set of reasonably clean records was extracted using the following conditions: ((AAGE>16) && (AGI>100) && (AFNLWGT>1) && (HRSWK>0)).
- Dataset Name: adult-census-income
- Language: English
- Total Size: 32,561 records
## Contents
The features and values that can be found in the adult census dataset are the following:
- **Income:** '>50K' (24.1%), '<=50K' (75.9%).
- **Age:** continuous.
- **Workclass:** Private, Self-emp-not-inc, Self-emp-inc, Federal-gov, Local-gov, State-gov, Without-pay, Never-worked.
- **fnlwgt:** continuous.
- **Education:** Bachelors, Some-college, 11th, HS-grad, Prof-school, Assoc-acdm, Assoc-voc, 9th, 7th-8th, 12th, Masters, 1st-4th, 10th, Doctorate, 5th-6th, Preschool.
- **Education.num:** continuous.
- **Marital.status:** Married-civ-spouse, Divorced, Never-married, Separated, Widowed, Married-spouse-absent, Married-AF-spouse.
- **Occupation:** Tech-support, Craft-repair, Other-service, Sales, Exec-managerial, Prof-specialty, Handlers-cleaners, Machine-op-inspect, Adm-clerical, Farming-fishing, Transport-moving, Priv-house-serv, Protective-serv, Armed-Forces.
- **Relationship:** Wife, Own-child, Husband, Not-in-family, Other-relative, Unmarried.
- **Race:** White, Asian-Pac-Islander, Amer-Indian-Eskimo, Other, Black.
- **Sex:** Female, Male.
- **Capital.gain:** continuous.
- **Capital.loss:** continuous.
- **Hours.per.week:** continuous.
- **Native.country:** United-States, Cambodia, England, Puerto-Rico, Canada, Germany, Outlying-US(Guam-USVI-etc), India, Japan, Greece, South, China, Cuba, Iran, Honduras,
Philippines, Italy, Poland, Jamaica, Vietnam, Mexico, Portugal, Ireland, France, Dominican-Republic, Laos, Ecuador, Taiwan, Haiti, Columbia,
Hungary, Guatemala, Nicaragua, Scotland, Thailand, Yugoslavia, El-Salvador, Trinadad&Tobago, Peru, Hong, Holand-Netherlands.
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("AiresPucrs/adult-census-income", split='train')
```
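Once loaded, a quick sanity check is to verify the income class balance against the percentages listed above. The snippet below sketches this with `collections.Counter` on a hypothetical handful of rows; the `sample_incomes` list stands in for the dataset's `income` column:

```python
from collections import Counter

# Stand-in for dataset["income"]; in the real dataset each record
# carries one of two string labels, '>50K' or '<=50K'.
sample_incomes = ["<=50K", "<=50K", ">50K", "<=50K", ">50K"]

counts = Counter(sample_incomes)
total = sum(counts.values())
share_over_50k = counts[">50K"] / total
print(f"{share_over_50k:.0%} of records earn >50K")  # 40% in this toy sample
```

On the full 32,561-record split, the same computation should yield roughly the 24.1% share reported in the Contents section.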
## License
This dataset is licensed under the [Creative Commons CC0 1.0](https://creativecommons.org/publicdomain/zero/1.0/) (Public Domain Dedication) license.
|
diwank/slimorca-corrected-chatml | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: token_count
dtype: int64
- name: judgement
dtype: string
- name: score
dtype: float64
- name: prompt
dtype: string
- name: response
dtype: string
- name: corrected_response
dtype: string
- name: chatml
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 2245334316
num_examples: 359295
download_size: 1103033253
dataset_size: 2245334316
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_dominguesm__canarim-7b | ---
pretty_name: Evaluation run of dominguesm/canarim-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dominguesm/canarim-7b](https://huggingface.co/dominguesm/canarim-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dominguesm__canarim-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T04:39:02.146933](https://huggingface.co/datasets/open-llm-leaderboard/details_dominguesm__canarim-7b/blob/main/results_2024-01-25T04-39-02.146933.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4128450113919017,\n\
\ \"acc_stderr\": 0.0341595304064157,\n \"acc_norm\": 0.41725828050909847,\n\
\ \"acc_norm_stderr\": 0.03494988220974851,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.4002971290542134,\n\
\ \"mc2_stderr\": 0.013722080397364233\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4709897610921502,\n \"acc_stderr\": 0.01458677635529432,\n\
\ \"acc_norm\": 0.5196245733788396,\n \"acc_norm_stderr\": 0.01460013207594709\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5755825532762398,\n\
\ \"acc_stderr\": 0.0049324414796655305,\n \"acc_norm\": 0.7752439753037244,\n\
\ \"acc_norm_stderr\": 0.004165684625540424\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3618421052631579,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.3618421052631579,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.3283018867924528,\n \"acc_stderr\": 0.028901593612411784,\n \
\ \"acc_norm\": 0.3283018867924528,\n \"acc_norm_stderr\": 0.028901593612411784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.03835153954399421,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.03835153954399421\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.02218203720294836,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.02218203720294836\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.027869320571664632,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.027869320571664632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03902551007374448,\n\
\ \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03902551007374448\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3939393939393939,\n \"acc_stderr\": 0.03481285338232963,\n \"\
acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.03481285338232963\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5699481865284974,\n \"acc_stderr\": 0.03572954333144808,\n\
\ \"acc_norm\": 0.5699481865284974,\n \"acc_norm_stderr\": 0.03572954333144808\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35384615384615387,\n \"acc_stderr\": 0.024243783994062164,\n\
\ \"acc_norm\": 0.35384615384615387,\n \"acc_norm_stderr\": 0.024243783994062164\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.4954128440366973,\n\
\ \"acc_stderr\": 0.021436420955529428,\n \"acc_norm\": 0.4954128440366973,\n\
\ \"acc_norm_stderr\": 0.021436420955529428\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690218,\n\
\ \"acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690218\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4362745098039216,\n \"acc_stderr\": 0.03480693138457038,\n \"\
acc_norm\": 0.4362745098039216,\n \"acc_norm_stderr\": 0.03480693138457038\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4641350210970464,\n \"acc_stderr\": 0.03246338898055659,\n \
\ \"acc_norm\": 0.4641350210970464,\n \"acc_norm_stderr\": 0.03246338898055659\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.484304932735426,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.484304932735426,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578757,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578757\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.044492703500683836,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.044492703500683836\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212238,\n\
\ \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212238\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6282051282051282,\n\
\ \"acc_stderr\": 0.03166098891888079,\n \"acc_norm\": 0.6282051282051282,\n\
\ \"acc_norm_stderr\": 0.03166098891888079\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5708812260536399,\n\
\ \"acc_stderr\": 0.01769938848312678,\n \"acc_norm\": 0.5708812260536399,\n\
\ \"acc_norm_stderr\": 0.01769938848312678\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.02814640599309636,\n\
\ \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.02814640599309636\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n\
\ \"acc_stderr\": 0.02836504154256457,\n \"acc_norm\": 0.5241157556270096,\n\
\ \"acc_norm_stderr\": 0.02836504154256457\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4691358024691358,\n \"acc_stderr\": 0.027767689606833935,\n\
\ \"acc_norm\": 0.4691358024691358,\n \"acc_norm_stderr\": 0.027767689606833935\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n \
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3305084745762712,\n\
\ \"acc_stderr\": 0.012014142101842958,\n \"acc_norm\": 0.3305084745762712,\n\
\ \"acc_norm_stderr\": 0.012014142101842958\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4526143790849673,\n \"acc_stderr\": 0.02013679091849254,\n \
\ \"acc_norm\": 0.4526143790849673,\n \"acc_norm_stderr\": 0.02013679091849254\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5124378109452736,\n\
\ \"acc_stderr\": 0.03534439848539579,\n \"acc_norm\": 0.5124378109452736,\n\
\ \"acc_norm_stderr\": 0.03534439848539579\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748017,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748017\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.037229657413855394,\n\
\ \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.037229657413855394\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.4002971290542134,\n\
\ \"mc2_stderr\": 0.013722080397364233\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.01269653187003862\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n \
\ \"acc_stderr\": 0.008238371412683989\n }\n}\n```"
repo_url: https://huggingface.co/dominguesm/canarim-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|arc:challenge|25_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|gsm8k|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hellaswag|10_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T04-39-02.146933.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T04-39-02.146933.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- '**/details_harness|winogrande|5_2024-01-25T04-39-02.146933.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T04-39-02.146933.parquet'
- config_name: results
data_files:
- split: 2024_01_25T04_39_02.146933
path:
- results_2024-01-25T04-39-02.146933.parquet
- split: latest
path:
- results_2024-01-25T04-39-02.146933.parquet
---
# Dataset Card for Evaluation run of dominguesm/canarim-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dominguesm/canarim-7b](https://huggingface.co/dominguesm/canarim-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dominguesm__canarim-7b",
"harness_winogrande_5",
	split="latest")
```
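Once a split is loaded, aggregate metrics can be recomputed from the per-task entries. A minimal sketch, using an in-memory dict that mirrors the structure of the results JSON below (the two task keys and their `acc` values are taken from this card; real usage would load them via `load_dataset` as shown above):

```python
# Macro-average accuracy over per-task results.
# The dict mirrors the per-task structure of the results JSON in this card.
per_task = {
    "harness|arc:challenge|25": {"acc": 0.4709897610921502},
    "harness|hellaswag|10": {"acc": 0.5755825532762398},
}

# Unweighted mean of the per-task accuracies.
macro_avg_acc = sum(m["acc"] for m in per_task.values()) / len(per_task)
print(round(macro_avg_acc, 4))  # → 0.5233
```

Note that the leaderboard's own aggregation may weight tasks differently; this is only an illustration of the data layout.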
## Latest results
These are the [latest results from run 2024-01-25T04:39:02.146933](https://huggingface.co/datasets/open-llm-leaderboard/details_dominguesm__canarim-7b/blob/main/results_2024-01-25T04-39-02.146933.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4128450113919017,
"acc_stderr": 0.0341595304064157,
"acc_norm": 0.41725828050909847,
"acc_norm_stderr": 0.03494988220974851,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.4002971290542134,
"mc2_stderr": 0.013722080397364233
},
"harness|arc:challenge|25": {
"acc": 0.4709897610921502,
"acc_stderr": 0.01458677635529432,
"acc_norm": 0.5196245733788396,
"acc_norm_stderr": 0.01460013207594709
},
"harness|hellaswag|10": {
"acc": 0.5755825532762398,
"acc_stderr": 0.0049324414796655305,
"acc_norm": 0.7752439753037244,
"acc_norm_stderr": 0.004165684625540424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3618421052631579,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.3618421052631579,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3283018867924528,
"acc_stderr": 0.028901593612411784,
"acc_norm": 0.3283018867924528,
"acc_norm_stderr": 0.028901593612411784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03835153954399421,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03835153954399421
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.02218203720294836,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.02218203720294836
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4,
"acc_stderr": 0.027869320571664632,
"acc_norm": 0.4,
"acc_norm_stderr": 0.027869320571664632
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03902551007374448,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03902551007374448
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.03481285338232963,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.03481285338232963
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5699481865284974,
"acc_stderr": 0.03572954333144808,
"acc_norm": 0.5699481865284974,
"acc_norm_stderr": 0.03572954333144808
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35384615384615387,
"acc_stderr": 0.024243783994062164,
"acc_norm": 0.35384615384615387,
"acc_norm_stderr": 0.024243783994062164
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4954128440366973,
"acc_stderr": 0.021436420955529428,
"acc_norm": 0.4954128440366973,
"acc_norm_stderr": 0.021436420955529428
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690218,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690218
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4362745098039216,
"acc_stderr": 0.03480693138457038,
"acc_norm": 0.4362745098039216,
"acc_norm_stderr": 0.03480693138457038
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4641350210970464,
"acc_stderr": 0.03246338898055659,
"acc_norm": 0.4641350210970464,
"acc_norm_stderr": 0.03246338898055659
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.484304932735426,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.484304932735426,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.044492703500683836,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.044492703500683836
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4662576687116564,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.4662576687116564,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212238,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.03166098891888079,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.03166098891888079
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5708812260536399,
"acc_stderr": 0.01769938848312678,
"acc_norm": 0.5708812260536399,
"acc_norm_stderr": 0.01769938848312678
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.02814640599309636,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.02814640599309636
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.02836504154256457,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.02836504154256457
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4691358024691358,
"acc_stderr": 0.027767689606833935,
"acc_norm": 0.4691358024691358,
"acc_norm_stderr": 0.027767689606833935
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.02737412888263115,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.02737412888263115
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3305084745762712,
"acc_stderr": 0.012014142101842958,
"acc_norm": 0.3305084745762712,
"acc_norm_stderr": 0.012014142101842958
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4526143790849673,
"acc_stderr": 0.02013679091849254,
"acc_norm": 0.4526143790849673,
"acc_norm_stderr": 0.02013679091849254
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5124378109452736,
"acc_stderr": 0.03534439848539579,
"acc_norm": 0.5124378109452736,
"acc_norm_stderr": 0.03534439848539579
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748017,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748017
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.037229657413855394,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.037229657413855394
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.4002971290542134,
"mc2_stderr": 0.013722080397364233
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.01269653187003862
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683989
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kkurihara5732/tmp-dataset-remake | ---
license: apache-2.0
---
|
vwxyzjn/cai-conversation-dev | ---
dataset_info:
features:
- name: index
dtype: int64
- name: prompt
dtype: string
- name: init_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: init_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 9128
num_examples: 4
- name: train_prefs
num_bytes: 10733
num_examples: 4
- name: test_sft
num_bytes: 15069
num_examples: 4
- name: test_prefs
num_bytes: 11987
num_examples: 4
download_size: 126881
dataset_size: 46917
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: train_prefs
path: data/train_prefs-*
- split: test_sft
path: data/test_sft-*
- split: test_prefs
path: data/test_prefs-*
---
# Dataset Card for "cai-conversation-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexrs/alpaca-cleaned-15-clusters | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
- name: cluster
dtype: int32
splits:
- name: train
num_bytes: 40490946
num_examples: 51760
download_size: 24185910
dataset_size: 40490946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "alpaca-cleaned-15-clusters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
XingLiangLiang/AudreyHepburn | ---
task_categories:
- feature-extraction
pretty_name: AudreyHepburn
size_categories:
- 10M<n<100M
---
|
allganize/flare-convfinqa-multiturn-ko | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: gpt
dtype: string
splits:
- name: test
num_bytes: 5027852
num_examples: 1152
download_size: 1442462
dataset_size: 5027852
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# flare-convfinqa-multiturn-ko
### Data Description
- `flare-convfinqa-multiturn-ko` is a multi-turn QA dataset about earnings reports of companies listed in the S&P 500.
It contains conversations of varying structure, from single-turn to multi-turn.
Each input provides both text and a table.
- To produce the Korean data, the test set of [ChanceFocus/flare-convfinqa](https://huggingface.co/datasets/ChanceFocus/flare-convfinqa) was first translated into Korean using Allganize Translator, an in-house translation model.
Examples with repeated words, empty generated questions, or omitted user prompts were then removed.
Where HTML markup had been translated into Korean, it was restored to proper HTML form.
Finally, the data was converted into chat-template format, yielding 1,152 examples.
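As a sketch of the chat-template layout described above, each record pairs a `conversations` list of role-tagged turns with a `gpt` field holding the reference answer to the final user turn. The field names follow the schema in this card; the sample values below are illustrative placeholders, not drawn from the dataset:

```python
# Illustrative record mirroring the flare-convfinqa-multiturn-ko schema.
# Values are made up for demonstration; real examples carry Korean
# financial QA content (text plus an HTML table) as shown below.
record = {
    "conversation_id": "convfinqa-example",
    "conversations": [
        {"role": "user", "content": "What was the weighted average exercise price in 2007?"},
        {"role": "assistant", "content": "60.94"},
        {"role": "user", "content": "And in 2005?"},
    ],
    "gpt": "25.14",
}

# The last turn is always a user question; `gpt` is its reference answer.
last_turn = record["conversations"][-1]
assert last_turn["role"] == "user"
print(f"Q: {last_turn['content']}")
print(f"A: {record['gpt']}")
```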
### Data Source
- [ChanceFocus/flare-convfinqa](https://huggingface.co/datasets/ChanceFocus/flare-convfinqa)
### Data Example
```
{
'conversation_id': 'convfinqa3',
'conversations': array([
{
'content':
'이 일련의 상호 연결된 재무 관련 쿼리와 회사의 재무 서류에서 제공되는 전제, 표 데이터 및 사후 텍스트의 맥락에서 마지막 질문에 대한 답변을 제공하십시오.
이를 위해서는 문맥에서 정보를 추출하고 수학적 계산을 수행해야 할 수도 있습니다. 답변을 작성할 때 이전 질문과 답변에 제공된 정보를 고려하시기 바랍니다:\n
문맥: 스톡옵션 2013년 마라톤 계획에 따라 스톡옵션을 부여받은 주식은 2007년 계획에 따라 부여된 스톡옵션과 2003년 계획에 따라 이전에 부여된 스톡옵션입니다.
2019 년의 스톡옵션은 부여 된 날짜의 보통주 가치에 따라 보통주 구매권을 나타냅니다. 2004 년까지 특정 스톡옵션은 2003 년 계획에 따라 탠덤 주식 상승 권리와 함께 부여되었습니다.
2003 년 계획에 따라 결정된 보통주의 공정한 시장 가치에서 초과하는 옵션 가격 대신 현금 및/또는 보통주 를 수령 할 수 있도록 수혜자가 선택할 수 있습니다.
일반적으로 2007 년 계획 및 2003 년 계획에 따라 부여 된 스톡옵션은 3 년 동안 균등하게 부여되며 부여 된 날로부터 최대 10 년 동안 부여됩니다.
주식 상승 권리는 2005 년 이전에 2003 년 계획에 따라 부여 된 스톡옵션 . 2007 년 계획에 따라 부여된 주식 상승 권리는 없습니다.
스톡옵션과 유사하게 주식 상승 권리는 부여 가격보다 초과하는 보통주 가치에 해당하는 금액을 수령 할 수있는 권리를 나타냅니다.
2003 년 계획에 따라 특정 스톡옵션은 주식 상환 스톡옵션으로 부여되고 다른 스톡옵션은 스톡옵션과 함께 부여되었습니다.
일반적으로 2003 년 계획에 따라 부여 된 스톡옵션은 3 년 동안 균등하게 부여되며 부여 된 날로부터 최대 10 년 동안 부여됩니다.
주식 기반 성과 보상 2005 년 이전에 2003 년 계획에 따라 주식 기반 성과 보상을 부여 한 마라톤 . 2007 년 계획에 따라 주식 기반 성과 보상을 부여 한 적이 없습니다.
2005 년부터 마라톤은 주식 기반 성과 보상을 중단하고 대신 임원에게 현금 상환 성과 단위를 부여합니다. 2003 년 계획에 따라 부여 된 모든 주식 기반 성과 보상은 전부 또는 몰수되었습니다.
결과적으로 주식 기반 성과 보상이 미지급되었습니다. 제한 주식 2013 마라톤은 2007 년 계획에 따라 제한 주식 및 제한 주식 단위를 부여하고 이전에 2003 년 계획에 따라 제한 주식 및 제한 주식 단위를 부여했습니다.
2005 년에 보상 위원회는 연간 장기 인센티브 패키지의 일부로 마라톤 및 그 통합 자회사에있는 특정 미국 기반 임원에게 시간 기반 제한 주식을 부여하기 시작했습니다.
임원에게 부여 된 제한 주식은 부여 날짜로부터 3 년 동안 부여되며 수령자의 지속적인 고용에 따라 부여됩니다.
마라톤은 또한 특정 비 임원 직원에게 제한 주식을, 특정 국제 직원에게 제한 주식 단위 ( 201 제한 주식 보상 201D ) 를 부여하며, 특정 지침 내에서 성과에 따라 보유 목적으로 부여합니다.
비 임원에게 부여 된 제한 주식은 일반적으로 3 년 동안 1/3씩 균등하게 부여되며 수령자의 지속적인 고용에 따라 부여됩니다.
부여되기 전에 모든 제한 주식 수령자는 해당 주식을 투표하고 이에 대한 배당금을받을 권리가 있습니다. 비 부여 주식은 양도 할 수 없으며 마라톤 2019 년 양도 대리인이 보유하고 있습니다.
보통주 단위 2013 마라톤은 2007 년 계획에 따라 비 직원 이사에게 주식 보상 프로그램을 유지하고 이전에 2003 년 계획에 따라 주식 보상 프로그램을 유지했습니다.
의장을 제외한 모든 비 직원 이사는 연간 보통주 단위를 수령하며, 이사직을 떠날 때까지 해당 단위를 보유해야 합니다. 마라톤 보통주에 대해 배당금이 지급될 때 이사들은 추가 보통주 단위의 형태로 배당금에 상응하는 금액을 받습니다.
주식 기반 보상 비용 2013 총 직원 주식 기반 보상 비용은 2007 년, 2006 년 및 2005 년에 $ 8 천만, $ 8 천만 및 $ 1 억 1 천만이었습니다.
총 관련 소득세 혜택은 $ 2 천만, $ 3 천만 및 $ 3 천만이었습니다. 2007 년과 2006 년에는 스톡옵션 행사로 인한 현금 수령액이 $ 2 천만과 $ 5 천만이었습니다.
2007 년과 2006 년 동안 기록 된 스톡옵션 행사 및 기타 주식 기반 상여금에 대한 주식 기반 보상 비용 초과에 대한 세액 공제 기간 동안 실현 된 세액 공제 혜택은 총 $ 3 천만 및 $ 3 억 6 천만이었습니다.
주식 옵션 보상의 현금 지급은 2007 년과 2006 년 총 $ 1 백만 및 $ 3 백만이었습니다. 2007 년, 2006 년 및 2005 년에 부여 된 주식 보상 중 마라톤은 임원 및 비 임원 직원 모두에게 주식 보상을 부여했습니다.
이러한 보상에 대한 가중 평균 부여일 공정 가치 는 다음 블랙-숄즈 가정에 기반했습니다 : .
<table class="wikitable"> <tr><td>1</td><td>2</td><td>2007</td><td>2006</td><td>2005</td></tr><tr><td>3</td><td>예상 연간 주당 배당금</td><td>
$ 0.96</td><td>$ 0.80</td><td>$ 0.66</td></tr><tr><td>4</td><td>예상 수명(년)</td><td>5.0</td><td>5.1</td><td>5.5</td></tr><tr><td>5</td><td>
예상 변동성</td><td>27 % ( 27 % )</td><td>28 % ( 28 % )</td><td>28 % ( 28 % )</td></tr><tr><td>6</td><td>무위험 이자율</td><td>4.1 % ( 4.1 % )</td>
<td>5.0 % ( 5.0 % )</td><td>3.8 % ( 3.8 % )</td></tr><tr><td>7</td><td>부여 된 주식 보상의 가중 평균 부여 일 공정 가치</td><td>$ 17.24</td><td>
$ 10.19</td><td>$ 6.15</td></tr></table> .\n2007 년 주당 가중 평균 행사 가격은 얼마였습니까?',
'role': 'user'
},
{'content': '60.94', 'role': 'assistant'},
{'content': '그리고 2005 년에는 얼마였습니까?', 'role': 'user'},
{'content': '25.14', 'role': 'assistant'},
{'content': '그렇다면 수년에 걸친 변화는 무엇입니까?', 'role': 'user'},
{'content': '35.8', 'role': 'assistant'},
{'content': '2005 년 주당 가중 평균 행사 가격은 얼마였습니까?', 'role': 'user'}],
dtype=object)
'gpt': '25.14'
}
``` |
MegPaulson/SD_melanoma_carcinoma | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 158272068.0
num_examples: 619
download_size: 157959064
dataset_size: 158272068.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EgilKarlsen/BGL_DistilRoBERTa_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211881627
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_DistilRoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tangjian234/tj1 | ---
license: openrail
---
|
unigram/fol-02 | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: proof
dtype: string
- name: premise_tptp
dtype: string
- name: hypothesis_tptp
dtype: string
- name: deberta_pred
dtype: string
- name: deberta_pred_r1_label
dtype: string
- name: deberta_pred_r2_label
dtype: string
splits:
- name: train
num_bytes: 95533827
num_examples: 11109
- name: validation
num_bytes: 10997205
num_examples: 1356
- name: test
num_bytes: 11174472
num_examples: 1296
download_size: 20173620
dataset_size: 117705504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
maneshkarun/hyperpartisan-cleaned | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: hyperpartisan
dtype: bool
- name: bias
dtype: int64
- name: cleaned_text
dtype: string
- name: word_count
dtype: int64
splits:
- name: train
num_bytes: 4355902524
num_examples: 600000
download_size: 2370814430
dataset_size: 4355902524
---
|
qillura/pusheen | ---
license: cc0-1.0
task_categories:
- text-to-image
language:
- en
pretty_name: Pusheen
size_categories:
- 1M<n<10M
--- |
Raymond-Moody/cs482-taxi | ---
language:
- en
pretty_name: CS482 Online Ensemble Learning
tags:
- finance
dataset_info:
features:
- name: key
dtype: string
- name: fare_amount
dtype: float64
- name: pickup_datetime
dtype: string
- name: pickup_longitude
dtype: float64
- name: pickup_latitude
dtype: float64
- name: dropoff_longitude
dtype: float64
- name: dropoff_latitude
dtype: float64
- name: passenger_count
dtype: int64
splits:
- name: train
num_bytes: 5321378239.727816
num_examples: 49881470
- name: test
num_bytes: 591264296.2721846
num_examples: 5542386
download_size: 3775655759
dataset_size: 5912642536.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Created as part of my CS482 Course using data from the [New York City Taxi Fare Kaggle Competition](https://www.kaggle.com/competitions/new-york-city-taxi-fare-prediction/rules). |
ckiplab/traditional-chinese-dolly-15k | ---
license: cc-by-sa-3.0
---
|
open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge | ---
pretty_name: Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BarryFutureman/WestLakeX-7B-EvoMerge](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T21:12:28.457963](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge/blob/main/results_2024-02-01T21-12-28.457963.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6539355408319422,\n\
\ \"acc_stderr\": 0.03205035581162719,\n \"acc_norm\": 0.6534261287046614,\n\
\ \"acc_norm_stderr\": 0.03272286633720597,\n \"mc1\": 0.5263157894736842,\n\
\ \"mc1_stderr\": 0.017479241161975457,\n \"mc2\": 0.674957392013114,\n\
\ \"mc2_stderr\": 0.014896008898157733\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238363,\n\
\ \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6989643497311293,\n\
\ \"acc_stderr\": 0.00457770702503138,\n \"acc_norm\": 0.8808006373232424,\n\
\ \"acc_norm_stderr\": 0.0032336074238899773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931038,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931038\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897226,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5263157894736842,\n\
\ \"mc1_stderr\": 0.017479241161975457,\n \"mc2\": 0.674957392013114,\n\
\ \"mc2_stderr\": 0.014896008898157733\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065597\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \
\ \"acc_stderr\": 0.01267042044019867\n }\n}\n```"
repo_url: https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|arc:challenge|25_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|gsm8k|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hellaswag|10_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T21-12-28.457963.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- '**/details_harness|winogrande|5_2024-02-01T21-12-28.457963.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T21-12-28.457963.parquet'
- config_name: results
data_files:
- split: 2024_02_01T21_12_28.457963
path:
- results_2024-02-01T21-12-28.457963.parquet
- split: latest
path:
- results_2024-02-01T21-12-28.457963.parquet
---
# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/WestLakeX-7B-EvoMerge](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge",
	"harness_winogrande_5",
	split="latest")
```
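The per-task config names listed above follow a simple convention: they are the task keys from the results JSON (e.g. `harness|hendrycksTest-anatomy|5`) with the `|`, `:`, and `-` separators replaced by underscores. A small helper (hypothetical, for illustration only — not part of the `datasets` library) sketching that mapping:

```python
def task_to_config(task: str) -> str:
    """Illustrative helper: map a harness task key to its dataset config name.

    e.g. "harness|hendrycksTest-anatomy|5" -> "harness_hendrycksTest_anatomy_5"
    """
    # The config names replace every "|", ":", and "-" with "_".
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```

Any of the resulting names can be passed as the second argument to `load_dataset`, as in the snippet above.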
## Latest results
These are the [latest results from run 2024-02-01T21:12:28.457963](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge/blob/main/results_2024-02-01T21-12-28.457963.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6539355408319422,
"acc_stderr": 0.03205035581162719,
"acc_norm": 0.6534261287046614,
"acc_norm_stderr": 0.03272286633720597,
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975457,
"mc2": 0.674957392013114,
"mc2_stderr": 0.014896008898157733
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.013512058415238363,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.013203196088537376
},
"harness|hellaswag|10": {
"acc": 0.6989643497311293,
"acc_stderr": 0.00457770702503138,
"acc_norm": 0.8808006373232424,
"acc_norm_stderr": 0.0032336074238899773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931038,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931038
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897226,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975457,
"mc2": 0.674957392013114,
"mc2_stderr": 0.014896008898157733
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065597
},
"harness|gsm8k|5": {
"acc": 0.6959818043972706,
"acc_stderr": 0.01267042044019867
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/squad_qa_no_id_v5_full_no_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7671973.074366877
num_examples: 4778
- name: validation
num_bytes: 342766
num_examples: 300
download_size: 1166862
dataset_size: 8014739.074366877
---
# Dataset Card for "squad_qa_no_id_v5_full_no_permute"
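The YAML header above declares the feature schema (SQuAD-style `question`/`context`/`answers` fields plus flattened `inputs`/`targets` strings). As a rough, hypothetical sketch of what one record looks like under that schema (all field values below are invented for illustration, not taken from the dataset):

```python
# Hypothetical example record matching the declared feature schema.
# Values are invented placeholders; real rows come from SQuAD-style data.
record = {
    "id": "5733be284776f41900661182",
    "title": "University_of_Notre_Dame",
    "context": "Architecturally, the school has a Catholic character...",
    "question": "To whom did the Virgin Mary allegedly appear in 1858?",
    "answers": {"text": ["Saint Bernadette Soubirous"], "answer_start": [515]},
    "answer": "Saint Bernadette Soubirous",
    "context_id": "c-0001",
    "inputs": "Question: To whom did the Virgin Mary allegedly appear in 1858?",
    "targets": "Saint Bernadette Soubirous",
}

# `answers` is a sequence of (text, answer_start) pairs, so the two
# inner lists are index-aligned: answer_start[i] is the character
# offset of text[i] inside `context`.
first_answer = record["answers"]["text"][0]
print(first_answer)
```

The `inputs`/`targets` pair suggests the rows are pre-formatted for seq2seq-style training on top of the raw QA fields.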
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
St4rJr/dataset-japones | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_101 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1346518296.0
num_examples: 264438
download_size: 1376379084
dataset_size: 1346518296.0
---
# Dataset Card for "chunk_101"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Roderich/2nd_testing | ---
license: other
---
|
open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16 | ---
pretty_name: Evaluation run of Kquant03/Eukaryote-8x7B-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kquant03/Eukaryote-8x7B-bf16](https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T08:12:21.184681](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16/blob/main/results_2024-01-17T08-12-21.184681.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563228779594618,\n\
\ \"acc_stderr\": 0.03193320528683609,\n \"acc_norm\": 0.6559570918809283,\n\
\ \"acc_norm_stderr\": 0.032596696779968556,\n \"mc1\": 0.4638922888616891,\n\
\ \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6316657951045663,\n\
\ \"mc2_stderr\": 0.01524980646948028\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892976,\n\
\ \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.01346008047800251\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6986656044612627,\n\
\ \"acc_stderr\": 0.004578999029127976,\n \"acc_norm\": 0.8729336785500896,\n\
\ \"acc_norm_stderr\": 0.0033236659644122007\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02959732973097809,\n \
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02959732973097809\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n\
\ \"acc_stderr\": 0.01664330737231588,\n \"acc_norm\": 0.45139664804469276,\n\
\ \"acc_norm_stderr\": 0.01664330737231588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.01274724896707906,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.01274724896707906\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827058,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827058\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4638922888616891,\n\
\ \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6316657951045663,\n\
\ \"mc2_stderr\": 0.01524980646948028\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7194844579226687,\n \
\ \"acc_stderr\": 0.012374608490929556\n }\n}\n```"
repo_url: https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|arc:challenge|25_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|gsm8k|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hellaswag|10_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T08-12-21.184681.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- '**/details_harness|winogrande|5_2024-01-17T08-12-21.184681.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T08-12-21.184681.parquet'
- config_name: results
data_files:
- split: 2024_01_17T08_12_21.184681
path:
- results_2024-01-17T08-12-21.184681.parquet
- split: latest
path:
- results_2024-01-17T08-12-21.184681.parquet
---
# Dataset Card for Evaluation run of Kquant03/Eukaryote-8x7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Eukaryote-8x7B-bf16](https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16",
	"harness_winogrande_5",
	split="latest")
```
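Once downloaded, the per-task metrics can be post-processed locally. A minimal sketch, using a hand-copied subset of the results JSON shown under "Latest results" below, that flattens each task's normalized accuracy into a simple dict (the `harness|<task>|<n_shot>` naming convention is taken from this card):

```python
# Flatten per-task normalized accuracies from a results dict shaped like
# the "Latest results" JSON in this card.
results = {
    "all": {"acc": 0.6563228779594618, "acc_norm": 0.6559570918809283},
    "harness|arc:challenge|25": {"acc": 0.659556313993174, "acc_norm": 0.6945392491467577},
    "harness|hellaswag|10": {"acc": 0.6986656044612627, "acc_norm": 0.8729336785500896},
}

# Skip the "all" aggregate; the task name is the middle field of "harness|<task>|<n_shot>".
per_task = {
    name.split("|")[1]: metrics["acc_norm"]
    for name, metrics in results.items()
    if name != "all"
}
print(per_task)  # {'arc:challenge': 0.6945392491467577, 'hellaswag': 0.8729336785500896}
```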
## Latest results
These are the [latest results from run 2024-01-17T08:12:21.184681](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16/blob/main/results_2024-01-17T08-12-21.184681.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each eval's results can be found in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.6563228779594618,
"acc_stderr": 0.03193320528683609,
"acc_norm": 0.6559570918809283,
"acc_norm_stderr": 0.032596696779968556,
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6316657951045663,
"mc2_stderr": 0.01524980646948028
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892976,
"acc_norm": 0.6945392491467577,
"acc_norm_stderr": 0.01346008047800251
},
"harness|hellaswag|10": {
"acc": 0.6986656044612627,
"acc_stderr": 0.004578999029127976,
"acc_norm": 0.8729336785500896,
"acc_norm_stderr": 0.0033236659644122007
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02959732973097809,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02959732973097809
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.01664330737231588,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.01664330737231588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.01274724896707906,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.01274724896707906
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827058,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827058
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6316657951045663,
"mc2_stderr": 0.01524980646948028
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.7194844579226687,
"acc_stderr": 0.012374608490929556
}
}
```
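Results files like the one above can be aggregated programmatically. A minimal sketch of averaging the MMLU (`hendrycksTest`) subtask accuracies; the key and field names follow the JSON structure shown, and the inline sample stands in for the full file:

```python
import json
from statistics import mean

# Small inline sample mirroring the structure of the results JSON above.
sample = json.loads("""
{
  "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
  "harness|gsm8k|5": {"acc": 0.7194844579226687}
}
""")

def mmlu_average(results):
    # Mean accuracy over the hendrycksTest (MMLU) subtasks only.
    accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
    return mean(accs)

print(round(mmlu_average(sample), 4))  # 0.6921
```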
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
martinGale/synthesized_corpus_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: transcript
dtype: string
splits:
- name: train
num_bytes: 1517270721
num_examples: 5052
- name: test
num_bytes: 166953520
num_examples: 562
download_size: 406579080
dataset_size: 1684224241
---
# Dataset Card for "synthesized_corpus_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM-submissions/ratishsp__seqplan__1646397329 | ---
benchmark: gem
type: prediction
submission_name: SeqPlan
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: SeqPlan
|
Felladrin/ChatML-HelpSteer | ---
license: cc-by-4.0
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- question-answering
- text-generation
---
[nvidia/HelpSteer](https://huggingface.co/datasets/nvidia/HelpSteer) in ChatML format, ready to use in [HuggingFace TRL's SFT Trainer](https://huggingface.co/docs/trl/main/en/sft_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
from transformers import AutoTokenizer

# The tokenizer supplies the ChatML chat template used for formatting.
tokenizer = AutoTokenizer.from_pretrained("Felladrin/Llama-160M-Chat-v1")

dataset = load_dataset("nvidia/HelpSteer", split="train")

def format(columns):
    # Build a two-turn conversation and render it with the chat template.
    prompt = columns["prompt"].strip()
    response = columns["response"].strip()
    messages = [
        {"role": "user", "content": prompt},
        {"role": "assistant", "content": response},
    ]
    return {"text": tokenizer.apply_chat_template(messages, tokenize=False)}

# Keep the formatted text plus the rating columns and export to Parquet.
dataset.map(format).select_columns(
    ["text", "helpfulness", "correctness", "coherence", "complexity", "verbosity"]
).to_parquet("train.parquet")
```
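Since the rating columns are carried through, one can filter for high-quality rows before training. A hedged sketch in plain Python (with `datasets`, the equivalent would be `dataset.filter(...)`; the thresholds below are illustrative choices, not part of this card):

```python
# HelpSteer ratings range 0-4; keep only rows rated highly on helpfulness
# and correctness. Thresholds here are illustrative assumptions.
def keep_high_quality(rows, min_helpfulness=3, min_correctness=3):
    return [
        r for r in rows
        if r["helpfulness"] >= min_helpfulness and r["correctness"] >= min_correctness
    ]

rows = [
    {"text": "a", "helpfulness": 4, "correctness": 4},
    {"text": "b", "helpfulness": 2, "correctness": 4},
    {"text": "c", "helpfulness": 3, "correctness": 3},
]
print([r["text"] for r in keep_high_quality(rows)])  # ['a', 'c']
```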
|
ChanceFocus/fiqa-sentiment-classification | ---
language: en
license: mit
dataset_info:
features:
- name: _id
dtype: string
- name: sentence
dtype: string
- name: target
dtype: string
- name: aspect
dtype: string
- name: score
dtype: float64
- name: type
dtype: string
splits:
- name: train
num_bytes: 119567
num_examples: 822
- name: valid
num_bytes: 17184
num_examples: 117
- name: test
num_bytes: 33728
num_examples: 234
download_size: 102225
dataset_size: 170479
---
# FiQA Sentiment Classification
## Dataset Description
This dataset is based on Task 1 of the Financial Sentiment Analysis in the Wild (FiQA) challenge. It follows the settings described in the paper 'A Baseline for Aspect-Based Sentiment Analysis in Financial Microblogs and News'. The dataset is split into three subsets: train (822 examples), valid (117), and test (234).
## Dataset Structure
- `_id`: ID of the data point
- `sentence`: The sentence
- `target`: The target of the sentiment
- `aspect`: The aspect of the sentiment
- `score`: The sentiment score
- `type`: The type of the data point (headline or post)
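The `score` field is a continuous sentiment value; a common preprocessing step is to bucket it into discrete labels. A hedged sketch (the zero threshold and neutral band are assumptions, not specified by the dataset):

```python
# Map FiQA's continuous sentiment score to a discrete label.
# The thresholds are illustrative assumptions, not part of the dataset spec.
def score_to_label(score, neutral_band=0.0):
    if score > neutral_band:
        return "positive"
    if score < -neutral_band:
        return "negative"
    return "neutral"

print([score_to_label(s) for s in (0.374, -0.208, 0.0)])  # ['positive', 'negative', 'neutral']
```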
## Additional Information
- Homepage: [FiQA Challenge](https://sites.google.com/view/fiqa/home)
- Citation: [A Baseline for Aspect-Based Sentiment Analysis in Financial Microblogs and News](https://arxiv.org/pdf/2211.00083.pdf)
## Downloading CSV
```python
from datasets import load_dataset
# Load the dataset from the hub
dataset = load_dataset("ChanceFocus/fiqa-sentiment-classification")
# Save the dataset to a CSV file
dataset["train"].to_csv("train.csv")
dataset["valid"].to_csv("valid.csv")
dataset["test"].to_csv("test.csv")
```
|
phosseini/multimodal_satire | ---
dataset_info:
features:
- name: url
dtype: string
- name: headline
dtype: string
- name: image_link
dtype: string
- name: is_satire
dtype: int32
splits:
- name: train
num_bytes: 2841764
num_examples: 10000
download_size: 1268537
dataset_size: 2841764
task_categories:
- image-classification
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset card for "multimodal_satire"
This is the dataset for the paper [A Multi-Modal Method for Satire Detection using Textual and Visual Cues](https://aclanthology.org/2020.nlp4if-1.4/). To obtain the full-text body of each article, you need to scrape the source websites using the links provided in the dataset.
* GitHub repository: [https://github.com/lilyli2004/satire](https://github.com/lilyli2004/satire)
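Since only headlines and links ship with the dataset, the article bodies have to be fetched and extracted yourself. A minimal stdlib sketch of the extraction step (real news sites need site-specific handling; fetching, e.g. via `urllib.request`, is omitted and a sample HTML string is used instead):

```python
from html.parser import HTMLParser

# Hedged sketch: pull paragraph text out of a fetched article page.
# This keeps only <p> text; real pages will need site-specific selectors.
class ParagraphExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_p = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_p = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_p = False

    def handle_data(self, data):
        if self.in_p:
            self.paragraphs[-1] += data

def extract_body(html):
    parser = ParagraphExtractor()
    parser.feed(html)
    return "\n".join(p.strip() for p in parser.paragraphs if p.strip())

html = "<html><body><h1>Headline</h1><p>First paragraph.</p><p>Second.</p></body></html>"
print(extract_body(html))
```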
## Reference
If you use this dataset, please cite the following paper:
```
@inproceedings{li-etal-2020-multi-modal,
title = "A Multi-Modal Method for Satire Detection using Textual and Visual Cues",
author = "Li, Lily and
Levi, Or and
Hosseini, Pedram and
Broniatowski, David",
booktitle = "Proceedings of the 3rd NLP4IF Workshop on NLP for Internet Freedom: Censorship, Disinformation, and Propaganda",
month = dec,
year = "2020",
address = "Barcelona, Spain (Online)",
publisher = "International Committee on Computational Linguistics (ICCL)",
url = "https://aclanthology.org/2020.nlp4if-1.4",
pages = "33--38",
abstract = "Satire is a form of humorous critique, but it is sometimes misinterpreted by readers as legitimate news, which can lead to harmful consequences. We observe that the images used in satirical news articles often contain absurd or ridiculous content and that image manipulation is used to create fictional scenarios. While previous work have studied text-based methods, in this work we propose a multi-modal approach based on state-of-the-art visiolinguistic model ViLBERT. To this end, we create a new dataset consisting of images and headlines of regular and satirical news for the task of satire detection. We fine-tune ViLBERT on the dataset and train a convolutional neural network that uses an image forensics technique. Evaluation on the dataset shows that our proposed multi-modal approach outperforms image-only, text-only, and simple fusion baselines.",
}
``` |
zaanind/qasimple | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 18378
num_examples: 25
download_size: 16549
dataset_size: 18378
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qasimple"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |