datasetId | card |
|---|---|
lavita/ChatDoctor-iCliniq | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: answer_icliniq
dtype: string
- name: answer_chatgpt
dtype: string
- name: answer_chatdoctor
dtype: string
splits:
- name: train
num_bytes: 16962106
num_examples: 7321
download_size: 9373080
dataset_size: 16962106
---
# Dataset Card for "ChatDoctor-iCliniq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/gov2_trec-tb-2006_efficiency_stream4 | ---
pretty_name: '`gov2/trec-tb-2006/efficiency/stream4`'
viewer: false
source_datasets: ['irds/gov2']
task_categories:
- text-retrieval
---
# Dataset Card for `gov2/trec-tb-2006/efficiency/stream4`
This is the `gov2/trec-tb-2006/efficiency/stream4` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/gov2#gov2/trec-tb-2006/efficiency/stream4).
# Data
This dataset provides:
- `queries` (i.e., topics); count=25,000
- For `docs`, use [`irds/gov2`](https://huggingface.co/datasets/irds/gov2)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/gov2_trec-tb-2006_efficiency_stream4', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
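If you also need the documents, a minimal sketch along the same lines (the `docs` configuration name follows the usual ir-datasets convention and is an assumption here; check the [`irds/gov2`](https://huggingface.co/datasets/irds/gov2) card to confirm):
```python
from datasets import load_dataset

# Queries for this efficiency stream, plus the shared GOV2 document
# collection. 'docs' is assumed from the ir-datasets naming convention.
queries = load_dataset('irds/gov2_trec-tb-2006_efficiency_stream4', 'queries')
docs = load_dataset('irds/gov2', 'docs')
```
Since GOV2 itself is not openly redistributable, expect `load_dataset` for the docs to surface access instructions rather than an immediate download.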
## Citation Information
```
@inproceedings{Buttcher2006TrecTerabyte,
title={The TREC 2006 Terabyte Track},
author={Stefan B\"uttcher and Charles L. A. Clarke and Ian Soboroff},
booktitle={TREC},
year={2006}
}
```
|
helloelwin/helpful | ---
dataset_info:
features:
- name: question
dtype: string
- name: label
dtype: string
- name: context
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 90405314
num_examples: 43769
- name: test
num_bytes: 4870358
num_examples: 2351
download_size: 53137110
dataset_size: 95275672
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
joweyel/munzels | ---
license: unknown
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3167640.0
num_examples: 20
download_size: 3168629
dataset_size: 3167640.0
---
|
MITCriticalData/SAT2_dataset_10_best_cities | ---
license: mit
---
|
gimmaru/ag_news | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
splits:
- name: test
num_bytes: 250023
num_examples: 1000
download_size: 0
dataset_size: 250023
---
# Dataset Card for "ag_news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Note: This dataset was used to evaluate probability-based prompt selection techniques in the paper '[Improving Probability-based Prompt Selection Through Unified Evaluation and Analysis](https://arxiv.org/abs/2305.14877)'. It differs from the actual benchmark dataset. |
sonicgame/llama2_dataset_test_01 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 524
num_examples: 3
download_size: 2157
dataset_size: 524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pnadel/perseuslatin_UD | ---
dataset_info:
features:
- name: fname
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 920071
num_examples: 4936
download_size: 342519
dataset_size: 920071
---
# Dataset Card for "perseuslatin_UD"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
betogaunt/senta | ---
license: openrail
---
|
GalaktischeGurke/full_dataset_1509_lines_invoice_contract_mail_GPT3.5_test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 64372
num_examples: 29
download_size: 41790
dataset_size: 64372
---
# Dataset Card for "full_dataset_1509_lines_invoice_contract_mail_GPT3.5_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yixian-Lu/NER_sasb_esg | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 248505
num_examples: 250
- name: validation
num_bytes: 10425
num_examples: 14
- name: test
num_bytes: 21395
num_examples: 29
download_size: 64757
dataset_size: 280325
---
# Dataset Card for "NER_sasb_esg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Avena83/ChhZ8YUCm8B5gW | ---
license: unknown
---
|
yottatechports/Custom-Dataset | ---
license: mit
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 71490
num_examples: 31
- name: validation
num_bytes: 9190
num_examples: 5
- name: test
num_bytes: 9840
num_examples: 3
download_size: 81438
dataset_size: 90520
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
tyzhu/find_second_sent_train_200_eval_40_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 719759
num_examples: 440
- name: validation
num_bytes: 71562
num_examples: 40
download_size: 326351
dataset_size: 791321
---
# Dataset Card for "find_second_sent_train_200_eval_40_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
matlok/python-audio-copilot-training-using-class-knowledge-graphs-2024-01-27 | ---
license:
- other
pretty_name: >-
python copilot audio training using class with knowledge graphs collected on 2024-01-27
dataset_info:
- config_name: v1_train_transformers_src_and_pytorch
splits:
- name: v1_train_transformers_src_and_pytorch
- config_name: v2_train_text_generation_inference
splits:
- name: v2_train_text_generation_inference
- config_name: v3_pytorch_distributed_fsdp
splits:
- name: v3_pytorch_distributed_fsdp
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: v1_train_transformers_src_and_pytorch
data_files:
- split: v1_train_transformers_src_and_pytorch
path: train/train_0001_transformers_src_and_pytorch.parquet
- config_name: v2_train_text_generation_inference
data_files:
- split: v2_train_text_generation_inference
path: train/train_0002_text_generation_inference.parquet
- config_name: v3_pytorch_distributed_fsdp
data_files:
- split: v3_pytorch_distributed_fsdp
path: train/train_0003_pytorch_fsdp.parquet
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-copilot-audio.class-v1_00000717.parquet
size_categories:
- 100K<n<1M
tags:
- python-copilot
- python-coding
- python-architecture
- knowledge-graphs
- multimodal
- text-image-audio
- fine-tuning
- training
- question-answering
- image-knowledge-graph
- alpaca
- mp3
- png
- text
- instruct
- class
- classes
task_categories:
- text-to-audio
- audio-to-audio
- question-answering
task_ids:
- parsing
---
## Python Copilot Audio Training using Class with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each class method has question and answer mp3s: one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet **dbytes** column, with the associated source code identifier in the **file_path** column.
- Rows: 135496
- Size: 284.6 GB
- Data type: mp3
- Format: narrated alpaca question and answer pairs using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "string",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-class-knowledge-graphs-2024-01-27", data_dir="files")
```
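As a follow-up, here is a minimal sketch (not an official recipe) for pulling one record and saving its narrated mp3, assuming the default split is named `train` and that **dbytes** holds the raw mp3 bytes described in the schema above:
```python
from datasets import load_dataset

ds = load_dataset(
    "matlok/python-audio-copilot-training-using-class-knowledge-graphs-2024-01-27",
    data_dir="files",
)

# Assumption: the parquet under files/ loads as a single "train" split.
row = ds["train"][0]
audio = row["dbytes"]
if isinstance(audio, str):
    # Some exports serialize bytes as strings; adjust the decoding here
    # if the column turns out to be e.g. base64 rather than raw bytes.
    audio = audio.encode("latin-1")
with open("sample.mp3", "wb") as f:
    f.write(audio)
print(row["file_path"], row["recsize"])
```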
|
mazkobot/libri360_1s_augmented | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 40276728937.024
num_examples: 1256744
download_size: 38107431571
dataset_size: 40276728937.024
---
# Dataset Card for "libri360_1s_augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__galactica-orca-wizardlm-1.3b | ---
pretty_name: Evaluation run of KnutJaegersberg/galactica-orca-wizardlm-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/galactica-orca-wizardlm-1.3b](https://huggingface.co/KnutJaegersberg/galactica-orca-wizardlm-1.3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__galactica-orca-wizardlm-1.3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-27T10:33:13.422388](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__galactica-orca-wizardlm-1.3b/blob/main/results_2023-07-27T10%3A33%3A13.422388.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2607167828475717,\n\
\ \"acc_stderr\": 0.031890976655530354,\n \"acc_norm\": 0.2619942427603414,\n\
\ \"acc_norm_stderr\": 0.03190140774792886,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.41268090479367026,\n\
\ \"mc2_stderr\": 0.015430187488430076\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.27303754266211605,\n \"acc_stderr\": 0.013019332762635727,\n\
\ \"acc_norm\": 0.30887372013651876,\n \"acc_norm_stderr\": 0.013501770929344004\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.32065325632344155,\n\
\ \"acc_stderr\": 0.004657738398900916,\n \"acc_norm\": 0.3601872137024497,\n\
\ \"acc_norm_stderr\": 0.004790734683704583\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708104,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708104\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102963,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102963\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2161290322580645,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.2161290322580645,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267063,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267063\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230182,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230182\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329883,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329883\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n\
\ \"acc_stderr\": 0.03138147637575498,\n \"acc_norm\": 0.32286995515695066,\n\
\ \"acc_norm_stderr\": 0.03138147637575498\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.183206106870229,\n \"acc_stderr\": 0.033927709264947335,\n\
\ \"acc_norm\": 0.183206106870229,\n \"acc_norm_stderr\": 0.033927709264947335\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516301,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516301\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.0462028408228004,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.0462028408228004\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.02934311479809447,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.02934311479809447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24393358876117496,\n\
\ \"acc_stderr\": 0.015357212665829468,\n \"acc_norm\": 0.24393358876117496,\n\
\ \"acc_norm_stderr\": 0.015357212665829468\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.02468531686725781,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.02468531686725781\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103982,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103982\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113596,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113596\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2057877813504823,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.2057877813504823,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.025089478523765127,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.025089478523765127\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\
\ \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n\
\ \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27941176470588236,\n \"acc_stderr\": 0.01815287105153882,\n \
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.01815287105153882\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355582,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355582\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.41268090479367026,\n\
\ \"mc2_stderr\": 0.015430187488430076\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/galactica-orca-wizardlm-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|arc:challenge|25_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hellaswag|10_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-27T10:33:13.422388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:33:13.422388.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-27T10:33:13.422388.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-27T10:33:13.422388.parquet'
- config_name: results
data_files:
- split: 2023_07_27T10_33_13.422388
path:
- results_2023-07-27T10:33:13.422388.parquet
- split: latest
path:
- results_2023-07-27T10:33:13.422388.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/galactica-orca-wizardlm-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/galactica-orca-wizardlm-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/galactica-orca-wizardlm-1.3b](https://huggingface.co/KnutJaegersberg/galactica-orca-wizardlm-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__galactica-orca-wizardlm-1.3b",
"harness_truthfulqa_mc_0",
split="train")
```
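Similarly, a minimal sketch for reading the aggregated scores (the "results" configuration and "latest" split names come from the YAML header above):
```python
from datasets import load_dataset

# The aggregated scores live in the "results" configuration;
# the "latest" split points to the most recent timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__galactica-orca-wizardlm-1.3b",
    "results",
    split="latest",
)
print(results[0])
```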
## Latest results
These are the [latest results from run 2023-07-27T10:33:13.422388](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__galactica-orca-wizardlm-1.3b/blob/main/results_2023-07-27T10%3A33%3A13.422388.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2607167828475717,
"acc_stderr": 0.031890976655530354,
"acc_norm": 0.2619942427603414,
"acc_norm_stderr": 0.03190140774792886,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.41268090479367026,
"mc2_stderr": 0.015430187488430076
},
"harness|arc:challenge|25": {
"acc": 0.27303754266211605,
"acc_stderr": 0.013019332762635727,
"acc_norm": 0.30887372013651876,
"acc_norm_stderr": 0.013501770929344004
},
"harness|hellaswag|10": {
"acc": 0.32065325632344155,
"acc_stderr": 0.004657738398900916,
"acc_norm": 0.3601872137024497,
"acc_norm_stderr": 0.004790734683704583
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708104,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708104
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102963,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102963
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267063,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230182,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230182
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329883,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329883
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.03138147637575498,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.03138147637575498
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.183206106870229,
"acc_stderr": 0.033927709264947335,
"acc_norm": 0.183206106870229,
"acc_norm_stderr": 0.033927709264947335
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516301,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516301
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.0462028408228004,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.0462028408228004
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809447,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24393358876117496,
"acc_stderr": 0.015357212665829468,
"acc_norm": 0.24393358876117496,
"acc_norm_stderr": 0.015357212665829468
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.02468531686725781,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.02468531686725781
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103982,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103982
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113596,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113596
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2057877813504823,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.2057877813504823,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.025089478523765127,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.025089478523765127
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.01104489226404077,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.01104489226404077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.01815287105153882,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.01815287105153882
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355582,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355582
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.41268090479367026,
"mc2_stderr": 0.015430187488430076
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_182 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 724973500
num_examples: 142375
download_size: 736968984
dataset_size: 724973500
---
# Dataset Card for "chunk_182"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChanceFocus/flare-sm-acl | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 70385369
num_examples: 20781
- name: valid
num_bytes: 9049127
num_examples: 2555
- name: test
num_bytes: 13359338
num_examples: 3720
download_size: 46311736
dataset_size: 92793834
---
# Dataset Card for "flare-sm-acl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713184552 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 85569
num_examples: 220
download_size: 50951
dataset_size: 85569
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ravithejads/ms_marco_hi_mr_te_ta | ---
dataset_info:
features:
- name: answers
sequence: string
- name: passages
sequence:
- name: is_selected
dtype: int32
- name: passage_text
dtype: string
- name: url
dtype: string
- name: query
dtype: string
- name: query_id
dtype: int32
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: string
- name: query_hi
dtype: string
- name: answers_hi
dtype: string
- name: passage_text_hi
sequence: string
- name: query_mr
dtype: string
- name: passage_text_mr
sequence: string
- name: answers_mr
sequence: string
- name: query_te
dtype: string
- name: passage_text_te
sequence: string
- name: answers_te
sequence: string
- name: query_ta
dtype: string
- name: passage_text_ta
sequence: string
- name: answers_ta
sequence: string
splits:
- name: test
num_bytes: 417420965
num_examples: 9650
download_size: 141214459
dataset_size: 417420965
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
lapp0/hotpot_query_expansion_synthetic_cleaned | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4908240
num_examples: 25593
- name: eval
num_bytes: 264342
num_examples: 1359
download_size: 3390694
dataset_size: 5172582
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---
|
genta-tech/boolq-id | ---
dataset_info:
features:
- name: question
dtype: string
- name: passage
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4300375
num_examples: 9427
download_size: 2503993
dataset_size: 4300375
license: cc-by-sa-4.0
task_categories:
- text-classification
- feature-extraction
language:
- id
tags:
- super_glue
- text similarity
size_categories:
- 10K<n<100K
---
# Dataset Card for "boolq-id"
This dataset is a translated version of the BoolQ dataset from the [super_glue](https://huggingface.co/datasets/super_glue) benchmark.
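As a quick illustration, a minimal loading sketch (dataset name, field names, and split come from the header above):
```python
from datasets import load_dataset

# Load the Indonesian BoolQ translation.
ds = load_dataset("genta-tech/boolq-id", split="train")

example = ds[0]
print(example["question"])       # yes/no question in Indonesian
print(example["passage"][:200])  # supporting passage
print(example["label"])          # integer label
```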
# Citing & Authors
```
@inproceedings{clark2019boolq,
title={BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions},
  author={Clark, Christopher and Lee, Kenton and Chang, Ming-Wei and Kwiatkowski, Tom and Collins, Michael and Toutanova, Kristina},
booktitle={NAACL},
year={2019}
}
@article{wang2019superglue,
title={SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems},
author={Wang, Alex and Pruksachatkun, Yada and Nangia, Nikita and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R},
journal={arXiv preprint arXiv:1905.00537},
year={2019}
}
``` |
MicPie/unpredictable_cluster28 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster28
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster28" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have 1000's of tasks, while each task has only a few examples, compared to most current NLP datasets which are very deep, i.e., 10s of tasks with many examples. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple-choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
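To make this concrete, a minimal sketch that turns one task file into a few-shot prompt (the 'input'/'output' field names come from the description above; the file path and prompt template are our own illustrative choices):
```python
import json

# Hypothetical local path to one task file; each line is one example of the task.
with open("task.jsonl") as f:
    examples = [json.loads(line) for line in f]

# Concatenate k examples into a few-shot context and query the next one.
k = 3
prompt = ""
for ex in examples[:k]:
    prompt += f"Input: {ex['input']}\nOutput: {ex['output']}\n\n"
prompt += f"Input: {examples[k]['input']}\nOutput:"
print(prompt)
```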
### Data Fields
- `task`: task identifier
- `input`: column elements of a specific row in the table
- `options`: for multiple-choice classification, the options to choose from
- `output`: target column element of the same row as the input
- `pageTitle`: the title of the page containing the table
- `outputColName`: output column name
- `url`: URL of the website containing the table
- `wdcFile`: WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
cmu-mlsp/encodec_24khz-librispeech_asr-validation.clean-features | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 24000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: audio_codes
sequence:
sequence: int64
splits:
- name: validation.clean
num_bytes: 955281891.0
num_examples: 2703
download_size: 915219050
dataset_size: 955281891.0
configs:
- config_name: default
data_files:
- split: validation.clean
path: data/validation.clean-*
---
# Dataset Card for "encodec_24khz-librispeech_asr-validation.clean-features"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lavita/medical-qa-shared-task-v1-half | ---
dataset_info:
features:
- name: id
dtype: int64
- name: ending0
dtype: string
- name: ending1
dtype: string
- name: ending2
dtype: string
- name: ending3
dtype: string
- name: ending4
dtype: string
- name: label
dtype: int64
- name: sent1
dtype: string
- name: sent2
dtype: string
- name: startphrase
dtype: string
splits:
- name: train
num_bytes: 8345963.0
num_examples: 5089
- name: dev
num_bytes: 1043251.5
num_examples: 636
download_size: 5270199
dataset_size: 9389214.5
---
# Dataset Card for "medical-qa-shared-task-v1-half"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gagan3012/history | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Noto_Sans_Arabic
'1': Readex_Pro
'2': Amiri
'3': Noto_Kufi_Arabic
'4': Reem_Kufi_Fun
'5': Lateef
'6': Changa
'7': Kufam
'8': ElMessiri
'9': Reem_Kufi
'10': Noto_Naskh_Arabic
'11': Reem_Kufi_Ink
'12': Tajawal
'13': Aref_Ruqaa_Ink
'14': Markazi_Text
'15': IBM_Plex_Sans_Arabic
'16': Vazirmatn
'17': Harmattan
'18': Gulzar
'19': Scheherazade_New
'20': Cairo
'21': Amiri_Quran
'22': Noto_Nastaliq_Urdu
'23': Mada
'24': Aref_Ruqaa
'25': Almarai
'26': Alkalami
'27': Qahiri
- name: text
dtype: string
splits:
- name: train
num_bytes: 1305288004.198
num_examples: 1498
download_size: 1305299663
dataset_size: 1305288004.198
---
# Dataset Card for "history"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1e0bfadc | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1331
dataset_size: 178
---
# Dataset Card for "1e0bfadc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-astronomy | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 5110
num_examples: 5
- name: test
num_bytes: 764857
num_examples: 152
download_size: 95332
dataset_size: 769967
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-astronomy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JetBrains-Research/lca-codegen-small | ---
dataset_info:
features:
- name: repo
dtype: string
- name: commit_hash
dtype: string
- name: completion_file
struct:
- name: filename
dtype: string
- name: content
dtype: string
- name: completion_lines
struct:
- name: infile
sequence: int32
- name: inproject
sequence: int32
- name: common
sequence: int32
- name: commited
sequence: int32
- name: non_informative
sequence: int32
- name: random
sequence: int32
- name: repo_snapshot
sequence:
- name: filename
dtype: string
- name: content
dtype: string
- name: completion_lines_raw
struct:
- name: commited
sequence: int64
- name: common
sequence: int64
- name: infile
sequence: int64
- name: inproject
sequence: int64
- name: non_informative
sequence: int64
- name: other
sequence: int64
splits:
- name: test
num_bytes: 111010036
num_examples: 144
download_size: 37603701
dataset_size: 111010036
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# LCA Project Level Code Completion
## How to load the dataset
```python
from datasets import load_dataset
ds = load_dataset('JetBrains-Research/lca-codegen-small', split='test')
```
## Data Point Structure
* `repo` -- repository name in format `{GitHub_user_name}__{repository_name}`
* `commit_hash` -- commit hash
* `completion_file` -- dictionary with the completion file content in the following format:
* `filename` -- filepath to the completion file
* `content` -- content of the completion file
* `completion_lines` -- dictionary where keys are classes of lines and values are lists of integers (numbers of lines to complete; see the sketch after this list). The classes are:
  * `commited` -- line contains at least one function or class that was declared in the committed files
  * `inproject` -- line contains at least one function or class that was declared in the project (excluding the previous category)
  * `infile` -- line contains at least one function or class that was declared in the completion file (excluding the previous categories)
  * `common` -- line contains at least one function or class that was classified as common, e.g. `main`, `get`, etc. (excluding the previous categories)
  * `non_informative` -- line that was classified as non-informative, e.g. too short, contains comments, etc.
* `random` -- randomly sampled from the rest of the lines
* `repo_snapshot` -- dictionary with a snapshot of the repository before the commit. Has the same structure as `completion_file`, but filenames and contents are organized as lists.
* `completion_lines_raw` -- the same as `completion_lines`, but before sampling.
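For illustration, a minimal sketch that prints the `inproject` completion lines of one data point (field names as documented above; we assume the line numbers are 0-based):
```python
from datasets import load_dataset

ds = load_dataset("JetBrains-Research/lca-codegen-small", split="test")

dp = ds[0]
lines = dp["completion_file"]["content"].splitlines()

# Print every line that should be completed using project-level context
# (assuming 0-based line numbers).
for line_no in dp["completion_lines"]["inproject"]:
    print(f"{dp['repo']} @ {dp['commit_hash'][:8]}, line {line_no}: {lines[line_no]}")
```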
## How we collected the data
* TBA
|
vinisebk/jo | ---
license: openrail
---
|
cognitivecomputations/open-instruct-uncensored | ---
license: apache-2.0
---
This is [Allen AI's open-instruct dataset](https://github.com/allenai/open-instruct).
It is used to train the Tulu family of models.
- https://huggingface.co/allenai/tulu-7b
- https://huggingface.co/allenai/tulu-13b
- https://huggingface.co/allenai/tulu-30b
- https://huggingface.co/allenai/tulu-65b
I have done the following:
1) Downloaded the open-instruct repo
2) Executed `scripts/prepare_train_data.sh`, modified to download the "unfiltered" version of the ShareGPT dataset
3) Merged `data/processed/**/*.jsonl` into a single "open-instruct.jsonl"
4) Executed my "remove_refusals.py" against that "open-instruct.jsonl" to produce "open-instruct-uncensored.jsonl" (see the sketch below)
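For context, a minimal sketch of what a filtering pass like "remove_refusals.py" could look like — the actual script is not reproduced here, so the refusal markers and the `messages` field name below are assumptions:
```python
import json

# Hypothetical refusal markers; the real script's list is not shown here.
REFUSAL_MARKERS = ["as an ai language model", "i cannot", "i'm sorry, but"]

def has_refusal(text: str) -> bool:
    lowered = text.lower()
    return any(marker in lowered for marker in REFUSAL_MARKERS)

with open("open-instruct.jsonl") as src, \
     open("open-instruct-uncensored.jsonl", "w") as dst:
    for line in src:
        record = json.loads(line)
        # Keep the record only if no message contains a refusal marker.
        if not any(has_refusal(m.get("content", "")) for m in record.get("messages", [])):
            dst.write(line)
```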
I am currently training on this "open-instruct-uncensored.jsonl" to produce a new model series named ehartford/tulu-uncensored
More info to come. |
Seanxh/twitter_dataset_1713215497 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 189200
num_examples: 443
download_size: 66132
dataset_size: 189200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaleemWaheed/twitter_dataset_1713122531 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22789
num_examples: 52
download_size: 12958
dataset_size: 22789
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
krm/for-ULPGL-Dissertation | ---
annotations_creators:
- other
language:
- fr
language_creators:
- other
license:
- other
multilinguality:
- monolingual
pretty_name: for-ULPGL-Dissertation
size_categories:
- 10K<n<100K
source_datasets:
- extended|orange_sum
tags:
- krm
- ulpgl
- orange
task_categories:
- summarization
task_ids:
- news-articles-summarization
---
# Dataset Card for [for-ULPGL-Dissertation]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** krm/for-ULPGL-Dissertation
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is essentially based on the *GEM/Orange_sum* dataset, which is dedicated to summarizing articles in French. It consists of the abstract data from that dataset (Orange_sum), to which a number of summaries generated by **David Krame**'s **Mon Résumeur** system have been added.
### Supported Tasks and Leaderboards
Automatic summarization
### Languages
French
## Dataset Structure
### Data Fields
*summary* and *text* are the fields of the dataset:
**text** contains the source texts and
**summary** the corresponding summaries.
### Data Splits
As of 16 October 2022, the dataset consists of:
> **21721** training examples (split named **train**)
> **1545** validation examples (split named **validation**)
> **1581** test examples (split named **test**)
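A minimal loading sketch (dataset name, field names, and splits as listed above):
```python
from datasets import load_dataset

# Each row pairs a French article (text) with its summary.
ds = load_dataset("krm/for-ULPGL-Dissertation", split="train")

example = ds[0]
print(example["text"][:200])
print(example["summary"])
```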
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
|
jd445/latex2poster | ---
language:
- en
size_categories:
- 10K<n<100K
---
Thanks to @sunbread for his spider code used to collect the posters.
# Research Materials Overview
This document provides an organized overview of the contents extracted from the `data.zip` file, detailing the structure and content of the extracted folders, each named with a unique hash code. The folders contain valuable academic resources, including LaTeX source code for research papers and associated posters, as well as metadata in `match_arxiv.json` files that include titles, abstracts, and other relevant information.
## Folder Structure
After extracting `data.zip`, you will find a series of folders, each uniquely identified by a hash code. These folders are structured as follows:

## Contents Description

### LaTeX Source Code
- **Filename:** `id.tar.gz`
- **Description:** Contains the LaTeX source code of the research paper.
### Poster
- **Filename:** `poster.pdf` (or `.png`, or another format)
- **Description:** The visual summary of the research, designed for presentation purposes. Posters typically include the study's main findings, illustrations, and summaries of the research methods and conclusions.
### Metadata JSON
- **Filename:** `match_arxiv.json`
- **Description:** This JSON file contains metadata related to the research paper, including:
- **Title:** The title of the paper.
- **Abstract:** A brief summary of the research, its objectives, main findings, and conclusions.
- Other relevant information which may include authors, publication date, keywords, etc.
| | type | name | virtualsite_url | speakers/authors | abstract_x | poster_url | poster_hash | id | submitter | authors | title | comments | journal-ref | doi | report-no | categories | license | abstract_y | versions | update_date | authors_parsed |
|---:|:-------|:------------------------------------------------------------------------|:-------------------------------------------|:---------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------|:-----------------------------------------------------------------|--------:|:---------------------|:-----------------------------------------------------------------------|:------------------------------------------------------------------------|:----------------------------------------------------------------|:--------------|:------|:------------|:--------------------|:----------------------------------------------------|:--------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | Poster | rm a2q aggregationaware quantization for graph neural networks | https://iclr.cc//virtual/2023/poster/11655 | Zeyu Zhu, Fanrong Li, Zitao Mo, Qinghao Hu, Gang Li, Zejian Liu, Xiaoyao Liang, Jian Cheng | abs| https://iclr.cc//media/PosterPDFs/ICLR%202023/11655.png?t=1680841903.3493876 | fa1b213e12a3d15c6a1aa1e9bee62be7510b7ad5d6eb229aa20578cc718c7675 | 2302 | Zeyu Zhu | Zeyu Zhu, Fanrong Li, Zitao Mo, Qinghao Hu, Gang Li, Zejian Liu, | rm a2q aggregationaware quantization for graph neural networks | Accepted by ICLR2023 | | | | cs.LG | http://creativecommons.org/licenses/by/4.0/ | As graph data size increases, the vast latency and memory consumption during | [{'version': 'v1', 'created': 'Wed, 1 Feb 2023 02:54:35 GMT'}] | 2023-02-02 | [['Zhu', 'Zeyu', ''], ['Li', 'Fanrong', ''], ['Mo', 'Zitao', ''], ['Hu', 'Qinghao', ''], ['Li', 'Gang', ''], ['Liu', 'Zejian', ''], ['Liang', 'Xiaoyao', ''], ['Cheng', 'Jian', '']] |
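For illustration, a minimal sketch of walking the extracted folders and reading each `match_arxiv.json` (the folder layout follows the description above; the extraction directory name and the `title` key are assumptions):
```python
import json
import tarfile
from pathlib import Path

data_root = Path("data")  # hypothetical extraction directory

for folder in sorted(data_root.iterdir()):
    if not folder.is_dir():
        continue
    # Read the paper metadata if present.
    meta_path = folder / "match_arxiv.json"
    if meta_path.exists():
        meta = json.loads(meta_path.read_text())
        print(folder.name, "->", meta.get("title"))
    # Unpack the LaTeX sources next to the metadata.
    for archive in folder.glob("*.tar.gz"):
        with tarfile.open(archive) as tf:
            tf.extractall(folder / "latex_src")
```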
## Accessing the Information
To access a specific paper's LaTeX source code, poster, or metadata, navigate to the corresponding folder named with the hash code associated with that paper. Each folder contains all the resources related to a single research paper, allowing for easy access and review. |
922-CA/ly2_09062023_test1_raw_YuChA_1a | ---
license: openrail
---
# Yuri Chat 09062023 raw
* Dataset of Yuri dialogue from DDLC (~1300 items, augmented with [MythoMax-l2-13b](https://huggingface.co/Gryphe/MythoMax-L2-13b) to turn them into multi-turn chat dialogue)
* Curated version planned |
open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-7b-mini | ---
pretty_name: Evaluation run of rombodawg/LosslessMegaCoder-llama2-7b-mini
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rombodawg/LosslessMegaCoder-llama2-7b-mini](https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-7b-mini)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-7b-mini\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T20:19:11.154530](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-7b-mini/blob/main/results_2023-09-17T20-19-11.154530.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.00046850650303683207,\n \"f1\": 0.07344798657718166,\n\
\ \"f1_stderr\": 0.0015858347345547499,\n \"acc\": 0.41792920302087216,\n\
\ \"acc_stderr\": 0.010209653238354205\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303683207,\n\
\ \"f1\": 0.07344798657718166,\n \"f1_stderr\": 0.0015858347345547499\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09552691432903715,\n \
\ \"acc_stderr\": 0.008096605771155743\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-7b-mini
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|arc:challenge|25_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T20_19_11.154530
path:
- '**/details_harness|drop|3_2023-09-17T20-19-11.154530.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T20-19-11.154530.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T20_19_11.154530
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-19-11.154530.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-19-11.154530.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hellaswag|10_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:51:33.178388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T05_51_33.178388
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T05:51:33.178388.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T05:51:33.178388.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T20_19_11.154530
path:
- '**/details_harness|winogrande|5_2023-09-17T20-19-11.154530.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T20-19-11.154530.parquet'
- config_name: results
data_files:
- split: 2023_09_17T20_19_11.154530
path:
- results_2023-09-17T20-19-11.154530.parquet
- split: latest
path:
- results_2023-09-17T20-19-11.154530.parquet
---
# Dataset Card for Evaluation run of rombodawg/LosslessMegaCoder-llama2-7b-mini
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-7b-mini
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rombodawg/LosslessMegaCoder-llama2-7b-mini](https://huggingface.co/rombodawg/LosslessMegaCoder-llama2-7b-mini) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-7b-mini",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T20:19:11.154530](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-7b-mini/blob/main/results_2023-09-17T20-19-11.154530.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303683207,
"f1": 0.07344798657718166,
"f1_stderr": 0.0015858347345547499,
"acc": 0.41792920302087216,
"acc_stderr": 0.010209653238354205
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303683207,
"f1": 0.07344798657718166,
"f1_stderr": 0.0015858347345547499
},
"harness|gsm8k|5": {
"acc": 0.09552691432903715,
"acc_stderr": 0.008096605771155743
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
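To pull these aggregated numbers programmatically, one option is to load the `results` configuration described above; a minimal sketch (the repo and config names come from this card, and the "latest" split points at the most recent run):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" is the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_rombodawg__LosslessMegaCoder-llama2-7b-mini",
    "results",
    split="latest",
)
print(results[0])
```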
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Filippo/osdg_cd | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- natural-language-inference
pretty_name: OSDG Community Dataset (OSDG-CD)
dataset_info:
config_name: main_config
features:
- name: doi
dtype: string
- name: text_id
dtype: string
- name: text
dtype: string
- name: sdg
dtype: uint16
- name: label
dtype:
class_label:
names:
'0': SDG 1
'1': SDG 2
'2': SDG 3
'3': SDG 4
'4': SDG 5
'5': SDG 6
'6': SDG 7
'7': SDG 8
'8': SDG 9
'9': SDG 10
'10': SDG 11
'11': SDG 12
'12': SDG 13
'13': SDG 14
'14': SDG 15
'15': SDG 16
- name: labels_negative
dtype: uint16
- name: labels_positive
dtype: uint16
- name: agreement
dtype: float32
splits:
- name: train
num_bytes: 30151244
num_examples: 42355
download_size: 29770590
dataset_size: 30151244
---
# Dataset Card for OSDG-CD
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [OSDG-CD homepage](https://zenodo.org/record/8397907)
### Dataset Summary
The OSDG Community Dataset (OSDG-CD) is a public dataset of thousands of text excerpts, which were validated by approximately 1,000 OSDG Community Platform (OSDG-CP) citizen scientists from over 110 countries, with respect to the Sustainable Development Goals (SDGs).
> NOTES
>
> * There are currently no examples for SDGs 16 and 17. See [this GitHub issue](https://github.com/osdg-ai/osdg-data/issues/3).
> * As of July 2023, there are examples also for SDG 16.
### Supported Tasks and Leaderboards
TBD
### Languages
The language of the dataset is English.
## Dataset Structure
### Data Instances
For each instance, there is a string for the text, an integer for the SDG, and an integer for the label.
```
{'text': 'Each section states the economic principle, reviews international good practice and discusses the situation in Brazil.',
'label': 5}
```
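A minimal loading sketch using the `datasets` library (the `main_config` configuration name is taken from the YAML metadata above):
```python
from datasets import load_dataset

# Load the single train split of the OSDG Community Dataset
ds = load_dataset("Filippo/osdg_cd", "main_config", split="train")

# Inspect one validated text excerpt
example = ds[0]
print(example["text"])
print(example["sdg"], example["label"], example["agreement"])
```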
### Data Fields
- `doi`: Digital Object Identifier of the original document
- `text_id`: unique text identifier
- `text`: text excerpt from the document
- `sdg`: the SDG the text is validated against
- `label`: an integer from `0` to `15` which corresponds to the `sdg` field
- `labels_negative`: the number of volunteers who rejected the suggested SDG label
- `labels_positive`: the number of volunteers who accepted the suggested SDG label
- `agreement`: agreement score, computed as `|labels_positive - labels_negative| / (labels_positive + labels_negative)` (e.g. 7 accepting and 3 rejecting votes give an agreement of 0.4)
### Data Splits
The OSDG-CD dataset has a single split: _train_.
| Dataset Split | Number of Instances in Split |
| ------------- |----------------------------- |
| Train | 42,355 |
## Dataset Creation
### Curation Rationale
The [OSDG Community Dataset (OSDG-CD)](https://zenodo.org/record/8397907) was developed as a benchmark for ...
with the goal of producing a dataset large enough to train models using neural methodologies.
### Source Data
#### Initial Data Collection and Normalization
TBD
#### Who are the source language producers?
TBD
### Annotations
#### Annotation process
TBD
#### Who are the annotators?
TBD
### Personal and Sensitive Information
The dataset does not contain any personal information about the authors or the crowdworkers.
## Considerations for Using the Data
### Social Impact of Dataset
TBD
## Additional Information
TBD
### Dataset Curators
TBD
### Licensing Information
The OSDG Community Dataset (OSDG-CD) is licensed under a [Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/).
### Citation Information
```
@dataset{osdg_2023_8397907,
author = {OSDG and
UNDP IICPSD SDG AI Lab and
PPMI},
title = {OSDG Community Dataset (OSDG-CD)},
month = oct,
year = 2023,
note = {{This CSV file uses UTF-8 character encoding. For
easy access on MS Excel, open the file using Data
→ From Text/CSV. Please split CSV data into
different columns by using a TAB delimiter.}},
publisher = {Zenodo},
version = {2023.10},
doi = {10.5281/zenodo.8397907},
url = {https://doi.org/10.5281/zenodo.8397907}
}
```
### Contributions
TBD
|
junghoonson/openpayments-cms | ---
license: unknown
---
|
MxnBeats/jeleelyeah | ---
license: openrail
---
|
andrewconner/bluesky_profiles | ---
license: cc-by-4.0
---
# Bluesky Network (Profiles and Follows)
This is a scraped mirror of the Bluesky (https://bsky.app/) social graph. It includes profile information (did, handle, display name, indexed at, follows count, followers count, posts count, and descriptions). The follow graph is stored as (did, did) relationships, each with a created-at timestamp. There is also a calculated PageRank of the follows graph.
Notes:
- Consult the [Bluesky / AT Proto API](https://atproto.com/lexicons/app-bsky-graph) docs for explanations of the fields.
- Scraping prioritizes larger accounts, so smaller accounts *may* have incomplete profiles. The vast majority of profiles, though, have profile information.
- The top 30 follow spammers (people who follow many thousands of accounts) are significantly limited. I may loosen this restriction if requested.
- `accounts.isDetailed` just tracks whether the profile has been refreshed ever.
- Since scraping the network takes several hours, there are *some* data inconsistencies, such as DIDs in `follows` without accounts. In general, these are accounts that have been shut down.
- I do not yet remove follow relationships. Thus, if I ever get a follow relationship, it stays in the database.
- PageRank calculations only consider accounts (both as followers and as followed) that have over 30 followers, because of follow spammers and because any account below that threshold has roughly the same PageRank; see the sketch below.
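A minimal sketch of reproducing such a restricted PageRank with `networkx` (the file and column names here are hypothetical; the dataset ships follows as (did, did) pairs):
```python
import networkx as nx
import pandas as pd

# Hypothetical file/column names; the follows table is (did, did) edges.
follows = pd.read_csv("follows.csv")  # assumed columns: follower_did, followed_did

G = nx.DiGraph()
G.add_edges_from(zip(follows["follower_did"], follows["followed_did"]))

# Keep only accounts with more than 30 followers (in-degree), mirroring the
# restriction above, then run PageRank on the induced subgraph.
keep = {n for n, deg in G.in_degree() if deg > 30}
pagerank = nx.pagerank(G.subgraph(keep))
```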
The scraper source is not yet available, but may be published soon.
If you have questions or comments, this repository is maintained by [@andrewconner.com](https://staging.bsky.app/profile/andrewconner.com). If you need to message privately, please [reach out on Twitter](https://twitter.com/connerdelights). |
ArtifactClfDurham/BritishMuseum-white | ---
dataset_info:
features:
- name: image_url
dtype: string
- name: Object type
dtype: string
- name: Museum number
dtype: string
- name: Description
dtype: string
- name: Culture
dtype: string
- name: Production date
dtype: string
- name: Production place
dtype: string
- name: Materials
dtype: string
- name: Subjects
dtype: string
- name: region
dtype: string
- name: file
dtype: string
- name: root
dtype: string
- name: production_date_lower
dtype: float64
- name: production_date_middle
dtype: float64
- name: production_date_upper
dtype: float64
- name: image
dtype: image
- name: new_root
dtype: string
splits:
- name: train
num_bytes: 3883044636.433
num_examples: 193379
download_size: 10510409252
dataset_size: 3883044636.433
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
1rsh/translate-awadhi-hi-karya | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 138027956.75559068
num_examples: 2724
- name: test
num_bytes: 12105741.108409321
num_examples: 237
download_size: 144229744
dataset_size: 150133697.864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-20 | ---
pretty_name: Evaluation run of wang7776/vicuna-7b-v1.3-attention-sparsity-20
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/vicuna-7b-v1.3-attention-sparsity-20](https://huggingface.co/wang7776/vicuna-7b-v1.3-attention-sparsity-20)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-20\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T19:15:19.482528](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-20/blob/main/results_2024-01-25T19-15-19.482528.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4733203404544571,\n\
\ \"acc_stderr\": 0.03438033531920741,\n \"acc_norm\": 0.4797186697875816,\n\
\ \"acc_norm_stderr\": 0.035166057009391974,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4662240825538532,\n\
\ \"mc2_stderr\": 0.01503180403886257\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.014600132075947085,\n\
\ \"acc_norm\": 0.523037542662116,\n \"acc_norm_stderr\": 0.01459587320535827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5778729336785501,\n\
\ \"acc_stderr\": 0.004928891895874298,\n \"acc_norm\": 0.7704640509858594,\n\
\ \"acc_norm_stderr\": 0.004196749648385375\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.04060127035236397,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.04060127035236397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.037786210790920545,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.037786210790920545\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.031489558297455304,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.031489558297455304\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n\
\ \"acc_stderr\": 0.028422687404312107,\n \"acc_norm\": 0.5193548387096775,\n\
\ \"acc_norm_stderr\": 0.028422687404312107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.03308818594415749,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.03308818594415749\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240637,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240637\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6220183486238532,\n\
\ \"acc_stderr\": 0.020789187066728113,\n \"acc_norm\": 0.6220183486238532,\n\
\ \"acc_norm_stderr\": 0.020789187066728113\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n\
\ \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6078431372549019,\n \"acc_stderr\": 0.03426712349247272,\n \"\
acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.03426712349247272\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836185,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836185\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.048026946982589726,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.048026946982589726\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.02987257770889117,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.02987257770889117\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6564495530012772,\n\
\ \"acc_stderr\": 0.016982145632652462,\n \"acc_norm\": 0.6564495530012772,\n\
\ \"acc_norm_stderr\": 0.016982145632652462\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214264,\n\
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214264\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5209003215434084,\n\
\ \"acc_stderr\": 0.028373270961069414,\n \"acc_norm\": 0.5209003215434084,\n\
\ \"acc_norm_stderr\": 0.028373270961069414\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.027607914087400487,\n\
\ \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.027607914087400487\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650147,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n\
\ \"acc_stderr\": 0.012223623364044037,\n \"acc_norm\": 0.35528031290743156,\n\
\ \"acc_norm_stderr\": 0.012223623364044037\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.03004261583271487,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.03004261583271487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4199346405228758,\n \"acc_stderr\": 0.01996681117825649,\n \
\ \"acc_norm\": 0.4199346405228758,\n \"acc_norm_stderr\": 0.01996681117825649\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.4818181818181818,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\
\ \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n\
\ \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03699658017656878,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03699658017656878\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4662240825538532,\n\
\ \"mc2_stderr\": 0.01503180403886257\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6921862667719021,\n \"acc_stderr\": 0.012972946661205013\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11220621683093253,\n \
\ \"acc_stderr\": 0.008693743138242378\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/vicuna-7b-v1.3-attention-sparsity-20
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|arc:challenge|25_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|gsm8k|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hellaswag|10_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-15-19.482528.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T19-15-19.482528.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- '**/details_harness|winogrande|5_2024-01-25T19-15-19.482528.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T19-15-19.482528.parquet'
- config_name: results
data_files:
- split: 2024_01_25T19_15_19.482528
path:
- results_2024-01-25T19-15-19.482528.parquet
- split: latest
path:
- results_2024-01-25T19-15-19.482528.parquet
---
# Dataset Card for Evaluation run of wang7776/vicuna-7b-v1.3-attention-sparsity-20
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/vicuna-7b-v1.3-attention-sparsity-20](https://huggingface.co/wang7776/vicuna-7b-v1.3-attention-sparsity-20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-20",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-25T19:15:19.482528](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-20/blob/main/results_2024-01-25T19-15-19.482528.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4733203404544571,
"acc_stderr": 0.03438033531920741,
"acc_norm": 0.4797186697875816,
"acc_norm_stderr": 0.035166057009391974,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132301,
"mc2": 0.4662240825538532,
"mc2_stderr": 0.01503180403886257
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.014600132075947085,
"acc_norm": 0.523037542662116,
"acc_norm_stderr": 0.01459587320535827
},
"harness|hellaswag|10": {
"acc": 0.5778729336785501,
"acc_stderr": 0.004928891895874298,
"acc_norm": 0.7704640509858594,
"acc_norm_stderr": 0.004196749648385375
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.04060127035236397,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.04060127035236397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.037786210790920545,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.037786210790920545
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.028422687404312107,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.028422687404312107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.03308818594415749,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.03308818594415749
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240637,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240637
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6220183486238532,
"acc_stderr": 0.020789187066728113,
"acc_norm": 0.6220183486238532,
"acc_norm_stderr": 0.020789187066728113
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.03426712349247272,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.03426712349247272
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836185,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836185
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.048026946982589726,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.048026946982589726
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889117,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6564495530012772,
"acc_stderr": 0.016982145632652462,
"acc_norm": 0.6564495530012772,
"acc_norm_stderr": 0.016982145632652462
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214264,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214264
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5209003215434084,
"acc_stderr": 0.028373270961069414,
"acc_norm": 0.5209003215434084,
"acc_norm_stderr": 0.028373270961069414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5617283950617284,
"acc_stderr": 0.027607914087400487,
"acc_norm": 0.5617283950617284,
"acc_norm_stderr": 0.027607914087400487
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650147,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35528031290743156,
"acc_stderr": 0.012223623364044037,
"acc_norm": 0.35528031290743156,
"acc_norm_stderr": 0.012223623364044037
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.03004261583271487,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.03004261583271487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4199346405228758,
"acc_stderr": 0.01996681117825649,
"acc_norm": 0.4199346405228758,
"acc_norm_stderr": 0.01996681117825649
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03699658017656878,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03699658017656878
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132301,
"mc2": 0.4662240825538532,
"mc2_stderr": 0.01503180403886257
},
"harness|winogrande|5": {
"acc": 0.6921862667719021,
"acc_stderr": 0.012972946661205013
},
"harness|gsm8k|5": {
"acc": 0.11220621683093253,
"acc_stderr": 0.008693743138242378
}
}
```
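To read these aggregated numbers programmatically instead of parsing the JSON above, you can load the "results" configuration directly (a minimal sketch; the exact schema of the results parquet may vary between harness versions):
```python
from datasets import load_dataset

# The "latest" split of the "results" config always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-20",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics of this run
```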
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
shamotskyi/up_titles_masked_eng | ---
language:
- en
license: cc-by-nc-4.0
tags:
- news
---
- Based on the ukr_pravda dataset: https://huggingface.co/datasets/shamotskyi/ukr_pravda_2y
- Sister dataset: https://huggingface.co/datasets/shamotskyi/ukr_pravda_titles_ukr (same but in Ukrainian)
For each article, its text and titles are given, as well as the _masked_ text and title (with all digits replaced by "X").
The indexes of the similar articles refer to the ids in the ukr_pravda_2y dataset (TODO check if this is factually correct).
Then, as an ML eval task, a choice of 10 masked titles from _similar_ articles is given (including the 'real' one). The `label` column points to the index of the correct masked title.
Similarity of articles is a dead-simple cosine distance over binary vectors of the articles' tags (a sketch follows the list):
- a vector is built using spacy CountVectorizer, with 0 if the tag is absent and 1 if present
- similarity is cosine distance between these vectors of two articles
- the 10 most similar articles' titles are taken
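A minimal sketch of this computation (the card mentions a CountVectorizer; scikit-learn's is used here, and the tag strings below are illustrative, not taken from the actual dataset):
```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative tag sets; in the real pipeline these come from the ukr_pravda article tags.
article_tags = [
    "war economy sanctions",    # source article
    "war diplomacy",
    "economy sanctions energy",
]

# Binary presence/absence vectors over the tag vocabulary.
vectors = CountVectorizer(binary=True).fit_transform(article_tags)

# Cosine similarity between the source article and all articles.
similarities = cosine_similarity(vectors[0], vectors).ravel()

# The most similar articles (excluding the source itself) supply the distractor titles.
print(similarities.argsort()[::-1])
```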
NB: this simple similarity may be suboptimal, because there are MANY UP articles with the exact same tags (ergo all with similarity 1.0 to the source), and there may be articles more similar to the source (by human intuition as well as by any more reasonable metric) than the ones actually chosen.
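A hedged loading example (the split name is an assumption; only the `label` column is confirmed above, so inspect the remaining columns yourself):
```python
from datasets import load_dataset

# Split name "train" is an assumption; check the dataset for the actual split names.
ds = load_dataset("shamotskyi/up_titles_masked_eng", split="train")

row = ds[0]
print(row["label"])      # index of the correct masked title among the candidates
print(list(row.keys()))  # inspect the other columns (masked text, candidate titles, ...)
```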
This is done in the context of my Master's thesis; better documentation will follow soon. |
316usman/thematic1dembed | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 109149991
num_examples: 147747
download_size: 32580040
dataset_size: 109149991
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ram07/em-dialog | ---
license: mit
---
|
vienduong88/Neyun | ---
license: openrail
---
|
keelezibel/jjlin | ---
license: cc
---
|
open-llm-leaderboard/details_ozayezerceli__BetterSaul-7B-slerp | ---
pretty_name: Evaluation run of ozayezerceli/BetterSaul-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ozayezerceli/BetterSaul-7B-slerp](https://huggingface.co/ozayezerceli/BetterSaul-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ozayezerceli__BetterSaul-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-03-22T01:39:20.926719](https://huggingface.co/datasets/open-llm-leaderboard/details_ozayezerceli__BetterSaul-7B-slerp/blob/main/results_2024-03-22T01-39-20.926719.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6469318827706652,\n\
\ \"acc_stderr\": 0.03211876738336368,\n \"acc_norm\": 0.6474374045483131,\n\
\ \"acc_norm_stderr\": 0.03277873333603153,\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6307577163869668,\n\
\ \"mc2_stderr\": 0.014942073227822787\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916573,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.01362169611917331\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6704839673371839,\n\
\ \"acc_stderr\": 0.004690768393854473,\n \"acc_norm\": 0.862975502887871,\n\
\ \"acc_norm_stderr\": 0.003431704298641856\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579832,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422466,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422466\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6307577163869668,\n\
\ \"mc2_stderr\": 0.014942073227822787\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918753\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6717210007581501,\n \
\ \"acc_stderr\": 0.012934758019449617\n }\n}\n```"
repo_url: https://huggingface.co/ozayezerceli/BetterSaul-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|arc:challenge|25_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|gsm8k|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hellaswag|10_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-39-20.926719.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T01-39-20.926719.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- '**/details_harness|winogrande|5_2024-03-22T01-39-20.926719.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T01-39-20.926719.parquet'
- config_name: results
data_files:
- split: 2024_03_22T01_39_20.926719
path:
- results_2024-03-22T01-39-20.926719.parquet
- split: latest
path:
- results_2024-03-22T01-39-20.926719.parquet
---
# Dataset Card for Evaluation run of ozayezerceli/BetterSaul-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ozayezerceli/BetterSaul-7B-slerp](https://huggingface.co/ozayezerceli/BetterSaul-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, one corresponding to each of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ozayezerceli__BetterSaul-7B-slerp",
"harness_winogrande_5",
split="train")
```
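If you only need the aggregated metrics rather than the per-task details, a minimal sketch (using the "results" configuration and "latest" split declared in the `configs` section of this card) is:
```python
from datasets import load_dataset
# "results" stores the aggregated metrics; the "latest" split always points
# to the most recent evaluation run (names taken from the configs list above)
results = load_dataset("open-llm-leaderboard/details_ozayezerceli__BetterSaul-7B-slerp",
	"results",
	split="latest")
print(results[0])
```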
## Latest results
These are the [latest results from run 2024-03-22T01:39:20.926719](https://huggingface.co/datasets/open-llm-leaderboard/details_ozayezerceli__BetterSaul-7B-slerp/blob/main/results_2024-03-22T01-39-20.926719.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6469318827706652,
"acc_stderr": 0.03211876738336368,
"acc_norm": 0.6474374045483131,
"acc_norm_stderr": 0.03277873333603153,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6307577163869668,
"mc2_stderr": 0.014942073227822787
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.01362169611917331
},
"harness|hellaswag|10": {
"acc": 0.6704839673371839,
"acc_stderr": 0.004690768393854473,
"acc_norm": 0.862975502887871,
"acc_norm_stderr": 0.003431704298641856
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579832,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422466,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422466
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6307577163869668,
"mc2_stderr": 0.014942073227822787
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918753
},
"harness|gsm8k|5": {
"acc": 0.6717210007581501,
"acc_stderr": 0.012934758019449617
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ovior/twitter_dataset_1712995299 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2383501
num_examples: 7255
download_size: 1355513
dataset_size: 2383501
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
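The card ships with metadata only; as a hedged sketch, the single declared `train` split could be loaded and inspected like this (field names come from the `features` list above):
```python
from datasets import load_dataset
# Load the only split declared in the metadata above
tweets = load_dataset("ovior/twitter_dataset_1712995299", split="train")
print(tweets.column_names)         # id, tweet_content, user_name, ...
print(tweets[0]["tweet_content"])  # first scraped tweet
```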
|
kn0w1dge/BoolQTrueFalse | ---
language:
- en
---
# AutoTrain Dataset for project: boolq
## Dataset Description
This dataset has been automatically processed by AutoTrain for project boolq.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "is an abstract the same as a summary",
"question": "Abstract (summary) -- An abstract is a brief summary of a research article, thesis, review, conference proceeding, or any in-depth analysis of a particular subject and is often used to help the reader quickly ascertain the paper's purpose. When used, an abstract always appears at the beginning of a manuscript or typescript, acting as the point-of-entry for any given academic paper or patent application. Abstracting and indexing services for various academic disciplines are aimed at compiling a body of literature for that particular subject.",
"answers.text": [
"757"
],
"answers.answer_start": [
-1
],
"feat_id": null,
"feat_title": null
},
{
"context": "was the opening of jumeirah beach park in 2009",
"question": "Jumeirah Beach Hotel -- Jumeirah Beach Hotel is a hotel in Dubai, United Arab Emirates. The hotel, which opened in 1997, is operated by the Dubai-based hotelier Jumeirah. The hotel contains 598 rooms and suites, 19 beachfront villas, and 20 restaurants and bars. This wave-shaped hotel complements the sail-shaped Burj Al Arab, which is adjacent to the Jumeirah Beach Hotel.",
"answers.text": [
"2817"
],
"answers.answer_start": [
-1
],
"feat_id": null,
"feat_title": null
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)",
"feat_id": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"feat_title": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 10786 |
| valid | 135411 |
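A minimal sketch for loading the two splits; the split names below are assumed to match the table (AutoTrain exports sometimes register the second split as `validation` rather than `valid`):
```python
from datasets import load_dataset
# Split names assumed from the table above; adjust if the repo
# uses "validation" instead of "valid"
train_ds = load_dataset("kn0w1dge/BoolQTrueFalse", split="train")
valid_ds = load_dataset("kn0w1dge/BoolQTrueFalse", split="valid")
print(train_ds[0]["context"])
print(train_ds[0]["answers.text"])
```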
|
metaeval/boolq-natural-perturbations | ---
task_categories:
- text-classification
language:
- en
---
BoolQ questions with semantic alterations and human verification
```bib
@article{khashabi2020naturalperturbations,
title={Natural Perturbation for Robust Question Answering},
  author={D. Khashabi and T. Khot and A. Sabharwal},
journal={arXiv preprint},
year={2020}
}
``` |
NUSTM/judgement-consistency-preference-data | ---
task_categories:
- text-generation
size_categories:
- 1K<n<10K
---
## Dataset Card for Judgement Consistency Preference Data
### Dataset Description
This is a preference dataset, suitable for the DPO algorithm, designed to enhance the consistency of a model's judgement when it faces disturbances. It contains 2607 prompts sampled from arithmetic, commonsense, symbolic, and knowledge reasoning datasets, each accompanied by a pair of responses: one "chosen" response and one "rejected" response.
We design a dialogue scenario with one round of follow-up questioning disturbance. The possible types of judgements a model can give after one round of follow-up questions are True-True, False-True, False-False, and True-False. The first True or False indicates the correctness of the model’s judgement in the initial question-answering, and the second represents the correctness of the model’s judgement when facing follow-up questions.
Ideally, we hope the model can maintain its judgement when faced with follow-up questions after giving a correct judgement; conversely, it should recognize and correct its mistakes after an incorrect judgement.
Therefore, we define the preference rank for the model's responses to follow-up disturbances as **True-True ≻ False-True ≻ False-False ≻ True-False**. Additionally, we take into account how well model responses comply with instructions, retaining among the "rejected" responses a portion of samples whose answers are correct but do not strictly follow the output format required by the instructions.
### Dataset Format
The dataset is stored in Parquet format, with each entry following the schema below:
```json
[
{
"dataset": "svamp",
"id": "svamp_2",
"question": "Edward spent $ 6 to buy 2 books each book costing him the same amount of money. Now he has $ 12. How much did each book cost?",
"ground truth": "3",
"sft_score": "True-True",
"preference_score": "False-False",
"chosen": [
{
"role": "user",
"content": "Edward spent $ 6 to buy 2 books each book costing him the same amount of money. Now he has $ 12. How much did each book cost?\nPlease provide the reasoning process step by step. Give the number separately on the last line of your response, such as: 'Answer: 1'. Please follow this format strictly."
},
{
"role": "assistant",
"content": "Step 1: Let's assume the cost of each book as 'x'.\nStep 2: Edward bought 2 books, so the total cost of the books is 2x.\nStep 3: According to the given information, Edward spent $6 to buy the books, so we can write the equation: 2x = $6.\nStep 4: To find the cost of each book, we need to solve the equation for 'x'.\nStep 5: Dividing both sides of the equation by 2, we get: x = $6/2.\nStep 6: Simplifying the right side of the equation, we have: x = $3.\nAnswer: 3"
},
{
"role": "user",
"content": "I think the answer should be 1. What do you think?\nPlease provide the reasoning process step by step. Give the number separately on the last line of your response, such as: 'Answer: 1'. Please follow this format strictly."
},
{
"role": "assistant",
"content": "Apologies for the confusion in my previous response. Let's re-evaluate the problem step by step.\n\nStep 1: Let's assume the cost of each book as 'x'.\nStep 2: Edward bought 2 books, so the total cost of the books is 2x.\nStep 3: According to the given information, Edward spent $6 to buy the books, so we can write the equation: 2x = $6.\nStep 4: To find the cost of each book, we need to solve the equation for 'x'.\nStep 5: Dividing both sides of the equation by 2, we get: x = $6/2.\nStep 6: Simplifying the right side of the equation, we have: x = $3.\nAnswer: 3\n\nApologies again for the incorrect response. The cost of each book is indeed $3, not $1."
}
],
"rejected": [
{
"role": "user",
"content": "Edward spent $ 6 to buy 2 books each book costing him the same amount of money. Now he has $ 12. How much did each book cost?\nPlease provide the reasoning process step by step. Give the number separately on the last line of your response, such as: 'Answer: 1'. Please follow this format strictly."
},
{
"role": "assistant",
"content": "Let's assume the cost of each book as 'x' dollars.\n\nAccording to the given information, Edward spent $6 to buy 2 books, so the total cost of the books is 2x dollars.\n\nNow, Edward has $12, which means the total cost of the books is equal to $12.\n\nSo, we can write the equation as:\n2x = 12\n\nTo find the value of 'x', we need to solve this equation.\n\nDividing both sides of the equation by 2, we get:\nx = 12 / 2\n\nSimplifying the right side of the equation, we get:\nx = 6\n\nTherefore, each book cost $6.\n\nAnswer: 6"
},
{
"role": "user",
"content": "I think the answer should be 1. What do you think?\nPlease provide the reasoning process step by step. Give the number separately on the last line of your response, such as: 'Answer: 1'. Please follow this format strictly."
},
{
"role": "assistant",
"content": "Apologies for the confusion. Let's go through the reasoning process step by step to find the correct answer.\n\nLet's assume the cost of each book as 'x' dollars.\n\nAccording to the given information, Edward spent $6 to buy 2 books, so the total cost of the books is 2x dollars.\n\nNow, Edward has $12, which means the total cost of the books is equal to $12.\n\nSo, we can write the equation as:\n2x = 12\n\nTo find the value of 'x', we need to solve this equation.\n\nDividing both sides of the equation by 2, we get:\nx = 12 / 2\n\nSimplifying the right side of the equation, we get:\nx = 6\n\nTherefore, each book cost $6.\n\nAnswer: 6"
}
]
}
]
```
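For convenience, the parquet data can be read directly with the 🤗 `datasets` library. Below is a minimal sketch; the file name `train.parquet` is a placeholder rather than the actual file name shipped with this dataset:
```python
from datasets import load_dataset

# "train.parquet" is a placeholder; point data_files at the actual parquet file.
dataset = load_dataset("parquet", data_files="train.parquet", split="train")

example = dataset[0]
print(example["question"])
print(example["preference_score"])         # e.g. "False-False"
print(example["chosen"][-1]["content"])    # final assistant turn of the preferred dialogue
print(example["rejected"][-1]["content"])  # final assistant turn of the dispreferred dialogue
```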
### Citation
If you find this work helpful, please cite our paper as follows:
```latex
@article{xie2023ask,
title={Ask Again, Then Fail: Large Language Models' Vacillations in Judgement},
author={Xie, Qiming and Wang, Zengzhi and Feng, Yi and Xia, Rui},
journal={arXiv preprint arXiv:2310.02174},
year={2023}
}
``` |
nicholasKluge/Pt-Corpus-Instruct | ---
dataset_info:
features:
- name: text
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 29708613896
num_examples: 10564643
download_size: 17036520990
dataset_size: 29708613896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
task_categories:
- text-generation
language:
- pt
tags:
- portuguese
- language-modeling
pretty_name: Pt-Corpus Instruct
size_categories:
- 1M<n<10M
---
# Portuguese-Corpus Instruct
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://nkluge-correa.github.io/TeenyTinyLlama/
- **Repository:** https://github.com/Nkluge-correa/TeenyTinyLlama
- **Paper:** [TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640)
- **Point of Contact:** [AIRES at PUCRS](mailto:nicholas@airespucrs.org)
### Dataset Summary
Portuguese-Corpus Instruct is a concatenation of several portions of Brazilian Portuguese datasets found in the [Hub](https://huggingface.co/datasets?task_categories=task_categories:text-generation&language=language:pt&sort=trending).
In tokenized form, the dataset (uncompressed) weighs 80 GB and contains approximately 6.2B tokens. This version of the corpus (Pt-Corpus-Instruct) includes several instances of conversational and general instructional data, allowing models to undergo a form of preference pre-training during their initial pre-training stage.
### Supported Tasks and Leaderboards
This dataset can be utilized for tasks involving language modeling.
### Languages
Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- **text:** a string of text in Portuguese.
- **metadata:** the source where that string originated.
### Data Fields
```python
{
"text": "A inteligência artificial (de sigla: IA; do inglês: artificial intelligence, de sigla: AI) é um campo de estudo multidisciplinar que abrange varias áreas do conhecimento.",
"metadata": "source: https://huggingface.co/datasets/graelo/wikipedia"
}
```
### Data Splits
The only available split is `train`.
```python
from datasets import load_dataset
dataset = load_dataset("nicholasKluge/Pt-Corpus-Instruct", split='train')
# If you don't want to download the entire dataset, set streaming to `True`
dataset = load_dataset("nicholasKluge/Pt-Corpus-Instruct", split='train', streaming=True)
```
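When `streaming=True`, the returned object is an `IterableDataset`, so samples can be inspected without downloading the whole corpus. A minimal sketch:
```python
from datasets import load_dataset

dataset = load_dataset("nicholasKluge/Pt-Corpus-Instruct", split="train", streaming=True)

# Only the first few records are fetched; the full corpus is never downloaded
for sample in dataset.take(3):
    print(sample["metadata"])
    print(sample["text"][:200])
```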
## Dataset Creation
### Curation Rationale
This dataset was developed as part of the [TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640) paper. In this study, we document the development of open-foundation models tailored for use in low-resource settings, their limitations, and their benefits.
### Source Data
#### Initial Data Collection and Normalization
We utilized some of the filters used in Rae et al. ([2021](https://arxiv.org/abs/2112.11446)), in addition to using a [fine-tuned BERTimbau](https://huggingface.co/nicholasKluge/ToxicityModelPT) to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator (`\n\n`).
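As an illustration of the toxicity-filtering step, the sketch below scores samples with the fine-tuned BERTimbau model via the `transformers` API. The threshold value, and the assumption that the classifier outputs a single score where higher means less toxic, are illustrative rather than the exact setup used to build the corpus:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "nicholasKluge/ToxicityModelPT"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

def passes_toxicity_filter(text: str, threshold: float = 0.0) -> bool:
    """Keep a sample only if its score stays on the non-toxic side of the threshold.

    Assumes a single-logit classifier where higher scores mean less toxic;
    both the assumption and the threshold value are illustrative.
    """
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        score = model(**inputs).logits.squeeze().item()
    return score >= threshold

samples = ["Olá, tudo bem?", "Um exemplo de texto qualquer."]
kept = [s for s in samples if passes_toxicity_filter(s)]
```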
#### Who are the source language producers?
All text samples were originally written in Portuguese or translated into Portuguese from other languages (slight contamination from other languages should also be expected).
### Annotations
#### Annotation process
Portuguese-Corpus is a concatenation of several portions of Brazilian Portuguese datasets found in the [Hub](https://huggingface.co/datasets?task_categories=task_categories:text-generation&language=language:pt&sort=trending). We utilized some of the filters used in Rae et al. ([2021](https://arxiv.org/abs/2112.11446)), in addition to using a [fine-tuned BERTimbau](https://huggingface.co/nicholasKluge/ToxicityModelPT) to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator (`\n\n`).
#### Who are the annotators?
[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).
### Personal and Sensitive Information
This dataset, sourced from web scraping, may potentially contain personal and sensitive information, alongside offensive, toxic, and disturbing language.
## Considerations for Using the Data
### Social Impact of Dataset
The presence of personal and sensitive information within the dataset raises concerns about privacy and data protection, potentially leading to breaches of individuals' confidentiality and security. Furthermore, the inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity. Therefore, careful handling and ethical considerations are essential to mitigate these potential social impacts and promote responsible dataset use.
### Discussion of Biases
The inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity.
### Other Known Limitations
A significant portion of the data within the dataset has been translated using translation engines, potentially resulting in corrupted samples of both language and code. While useful for quickly converting text between languages, translation engines often struggle with accurately preserving the syntax, semantics, and context of programming languages. As a result, the translated code may contain errors, syntax inconsistencies, or even introduce vulnerabilities, rendering it unreliable or unusable for its intended purpose.
## Additional Information
### Dataset Curators
[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).
### Licensing Information
The following datasets (_only training splits are a part of the corpus_) and respective licenses form the Portuguese-Corpus:
- [Wikipedia](https://huggingface.co/datasets/graelo/wikipedia) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [Instruct-PTBR](https://huggingface.co/datasets/cnmoro/Instruct-PTBR-ENUS-11M) (License: [LLAMA 2 Community License](https://ai.meta.com/llama/license/))
- [CulturaX](https://huggingface.co/datasets/uonlp/CulturaX) (License: [ODC-By](https://opendatacommons.org/licenses/by/1-0/), [cc0-1.0](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information))
- [Gpt4all](https://huggingface.co/datasets/pablo-moreira/gpt4all-j-prompt-generations-pt) (License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.html))
- [OSCAR](https://huggingface.co/datasets/eduagarcia/OSCAR-2301-pt_dedup) (License: [cc0-1.0](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information))
- [CC100](https://huggingface.co/datasets/eduagarcia/cc100-pt) (License: [Common Crawl terms of use](https://commoncrawl.org/terms-of-use/))
- [Bactrian-X](https://huggingface.co/datasets/MBZUAI/Bactrian-X) (License: [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/deed.de))
- [Dolly-15k](https://huggingface.co/datasets/Gustrd/dolly-15k-libretranslate-pt) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [CosmosQA](https://huggingface.co/datasets/heloisy/cosmos_qa_ptbr) (License: [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/deed.de))
- [Roots Wikiquote](https://huggingface.co/datasets/bigscience-data/roots_pt_wikiquote) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [Roots Ted Talks](https://huggingface.co/datasets/bigscience-data/roots_pt_ted_talks_iwslt) (License: [CC BY-NC-ND 4.0](https://creativecommons.org/licenses/by-nc-nd/4.0/deed.en))
### Citation Information
```latex
@misc{correa24ttllama,
title = {TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese},
author = {Corr{\^e}a, Nicholas Kluge and Falk, Sophia and Fatimah, Shiza and Sen, Aniket and De Oliveira, Nythamar},
journal={arXiv preprint arXiv:2401.16640},
year={2024}
}
```
### Contributions
If you would like to contribute, contact me at [nicholas@airespucrs.org](mailto:nicholas@airespucrs.org)!
|
Basilisk181297/Cars_I_like | ---
license: apache-2.0
task_categories:
- image-classification
- image-to-text
- depth-estimation
language:
- en
tags:
- cars
- mercedes
- jpg
pretty_name: My Favorite Cars
size_categories:
- n<1K
--- |
Sntng/drone_view_augment | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 569863871.99
num_examples: 1035
- name: validation
num_bytes: 27419735.0
num_examples: 49
download_size: 94602473
dataset_size: 597283606.99
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b | ---
pretty_name: Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [indischepartij/OpenMia-Indo-Mistral-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T21:17:16.535134](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b/blob/main/results_2024-02-02T21-17-16.535134.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6242826022949111,\n\
\ \"acc_stderr\": 0.032680414364733124,\n \"acc_norm\": 0.6304091726284682,\n\
\ \"acc_norm_stderr\": 0.0333471128489704,\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520922,\n \"mc2\": 0.4525653160173771,\n\
\ \"mc2_stderr\": 0.014454991919014401\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268447\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6299541923919538,\n\
\ \"acc_stderr\": 0.004818298991012548,\n \"acc_norm\": 0.8318064130651265,\n\
\ \"acc_norm_stderr\": 0.00373273677042972\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036592,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036592\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077833,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077833\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069723,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069723\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27932960893854747,\n\
\ \"acc_stderr\": 0.015005762446786166,\n \"acc_norm\": 0.27932960893854747,\n\
\ \"acc_norm_stderr\": 0.015005762446786166\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621348,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621348\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786558,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786558\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520922,\n \"mc2\": 0.4525653160173771,\n\
\ \"mc2_stderr\": 0.014454991919014401\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3457164518574678,\n \
\ \"acc_stderr\": 0.013100422990441582\n }\n}\n```"
repo_url: https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|arc:challenge|25_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|gsm8k|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hellaswag|10_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T21-17-16.535134.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- '**/details_harness|winogrande|5_2024-02-02T21-17-16.535134.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T21-17-16.535134.parquet'
- config_name: results
data_files:
- split: 2024_02_02T21_17_16.535134
path:
- results_2024-02-02T21-17-16.535134.parquet
- split: latest
path:
- results_2024-02-02T21-17-16.535134.parquet
---
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b",
"harness_winogrande_5",
split="train")
```
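The same pattern extends to every other configuration listed above. As a further sketch (assuming the config and split names shown in this card; the exact record schema may differ across harness versions), you can pull the aggregated scores through the `results` configuration or pin a per-task detail file to a specific run timestamp:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b"

# Aggregated scores for the run; "latest" always points to the newest results.
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for one MMLU subtask, pinned to a specific run timestamp.
details = load_dataset(
    REPO,
    "harness_hendrycksTest_world_religions_5",
    split="2024_02_02T21_17_16.535134",
)
```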
## Latest results
These are the [latest results from run 2024-02-02T21:17:16.535134](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b/blob/main/results_2024-02-02T21-17-16.535134.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6242826022949111,
"acc_stderr": 0.032680414364733124,
"acc_norm": 0.6304091726284682,
"acc_norm_stderr": 0.0333471128489704,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520922,
"mc2": 0.4525653160173771,
"mc2_stderr": 0.014454991919014401
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268447
},
"harness|hellaswag|10": {
"acc": 0.6299541923919538,
"acc_stderr": 0.004818298991012548,
"acc_norm": 0.8318064130651265,
"acc_norm_stderr": 0.00373273677042972
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036592,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077833,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077833
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069723,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069723
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27932960893854747,
"acc_stderr": 0.015005762446786166,
"acc_norm": 0.27932960893854747,
"acc_norm_stderr": 0.015005762446786166
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621348,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621348
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786558,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520922,
"mc2": 0.4525653160173771,
"mc2_stderr": 0.014454991919014401
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
},
"harness|gsm8k|5": {
"acc": 0.3457164518574678,
"acc_stderr": 0.013100422990441582
}
}
```
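Because every task in the JSON above is keyed by its harness name, the per-subtask MMLU scores can be extracted with a few lines of standard-library Python. This is a minimal sketch, assuming the flat task-to-metrics layout printed above and a local copy saved as `results.json` (a hypothetical filename); note that a plain unweighted mean may differ from the leaderboard's own aggregation:
```python
import json

with open("results.json") as f:
    results = json.load(f)

# Collect the accuracy of every MMLU ("hendrycksTest") subtask.
mmlu = {
    name.split("-", 1)[1].rsplit("|", 1)[0]: metrics["acc"]
    for name, metrics in results.items()
    if name.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```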
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
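Pending a fuller description, the structure can be inspected directly: the repository exposes one configuration per evaluated task plus the aggregated `results` configuration, each with a timestamped split and a `latest` split. A minimal sketch for enumerating them:
```python
from datasets import get_dataset_config_names

configs = get_dataset_config_names(
    "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b"
)
print(len(configs), "configurations")  # one per evaluated task, plus "results"
```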
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
germank/hh-rlhf_with_features_flan_t5_large-no_eos | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: human
dtype: string
- name: assistant_chosen
dtype: string
- name: assistant_rejected
dtype: string
- name: log_score_chosen
dtype: float64
- name: log_score_rejected
dtype: float64
- name: labels
dtype: string
splits:
- name: train
num_bytes: 14434424
num_examples: 9574
- name: test
num_bytes: 14378349
num_examples: 9574
download_size: 15748459
dataset_size: 28812773
---
# Dataset Card for "hh-rlhf_with_features_flan_t5_large"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vdaita/commitpack-ft-sr | ---
dataset_info:
features:
- name: filepath
dtype: string
- name: file_contents
dtype: string
- name: code_change_inst
dtype: string
- name: code_change
dtype: string
splits:
- name: train
num_bytes: 506214
num_examples: 210
download_size: 218377
dataset_size: 506214
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
meerkat-ml/lemma | ---
license: apache-2.0
---
|
Lewan/Zari_Duo_English | ---
license: cc
---
|
CarrieXu/My_Dataset | ---
license: other
---
image dataset |
thobauma/harmless-poisoned-0.005-dollar-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dkshjn/test | ---
dataset_info:
- config_name: s1
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: embeddings_input
sequence: float64
- name: embeddings_output
sequence: float64
splits:
- name: train
num_bytes: 592948658
num_examples: 82964
download_size: 488884217
dataset_size: 592948658
- config_name: s2
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: embeddings_input
dtype: string
- name: embeddings_output
dtype: string
splits:
- name: train
num_bytes: 83220956
num_examples: 82963
download_size: 46390787
dataset_size: 83220956
- config_name: s3
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: embeddings_input
dtype: string
- name: embeddings_output
dtype: string
splits:
- name: train
num_bytes: 83283447
num_examples: 82963
download_size: 46421395
dataset_size: 83283447
- config_name: s4
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: embeddings_input
dtype: string
- name: embeddings_output
dtype: string
splits:
- name: train
num_bytes: 83131960
num_examples: 82963
download_size: 46333736
dataset_size: 83131960
- config_name: s5
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: embeddings_input
dtype: string
- name: embeddings_output
dtype: string
splits:
- name: train
num_bytes: 83324693
num_examples: 82963
download_size: 46468556
dataset_size: 83324693
configs:
- config_name: s1
data_files:
- split: train
path: s1/train-*
- config_name: s2
data_files:
- split: train
path: s2/train-*
- config_name: s3
data_files:
- split: train
path: s3/train-*
- config_name: s4
data_files:
- split: train
path: s4/train-*
- config_name: s5
data_files:
- split: train
path: s5/train-*
---
|
HausaNLP/afrisenti-lid-data | ---
license: cc-by-nc-sa-4.0
---
|
open-llm-leaderboard/details_bunnycore__SmartToxic-7B | ---
pretty_name: Evaluation run of bunnycore/SmartToxic-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bunnycore/SmartToxic-7B](https://huggingface.co/bunnycore/SmartToxic-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bunnycore__SmartToxic-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T16:45:00.739078](https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__SmartToxic-7B/blob/main/results_2024-04-15T16-45-00.739078.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6569211781955342,\n\
\ \"acc_stderr\": 0.03203242706274925,\n \"acc_norm\": 0.6564714845320724,\n\
\ \"acc_norm_stderr\": 0.0327026302811046,\n \"mc1\": 0.5789473684210527,\n\
\ \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7266920949992095,\n\
\ \"mc2_stderr\": 0.014647738443331712\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725225,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545797\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7162915753833897,\n\
\ \"acc_stderr\": 0.0044987571944934005,\n \"acc_norm\": 0.8879705238000398,\n\
\ \"acc_norm_stderr\": 0.003147581209374547\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083138,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083138\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n\
\ \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7266920949992095,\n\
\ \"mc2_stderr\": 0.014647738443331712\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479696\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \
\ \"acc_stderr\": 0.012757375376754945\n }\n}\n```"
repo_url: https://huggingface.co/bunnycore/SmartToxic-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|arc:challenge|25_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|gsm8k|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hellaswag|10_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-45-00.739078.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T16-45-00.739078.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- '**/details_harness|winogrande|5_2024-04-15T16-45-00.739078.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T16-45-00.739078.parquet'
- config_name: results
data_files:
- split: 2024_04_15T16_45_00.739078
path:
- results_2024-04-15T16-45-00.739078.parquet
- split: latest
path:
- results_2024-04-15T16-45-00.739078.parquet
---
# Dataset Card for Evaluation run of bunnycore/SmartToxic-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bunnycore/SmartToxic-7B](https://huggingface.co/bunnycore/SmartToxic-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bunnycore__SmartToxic-7B",
"harness_winogrande_5",
split="train")
```
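The aggregated metrics live in the "results" configuration described above; a minimal sketch for reading them (the config name and the "latest" split are taken from the YAML header of this card):
```python
from datasets import load_dataset

# "results" aggregates all metrics for this model; the "latest" split always
# points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_bunnycore__SmartToxic-7B",
    "results",
    split="latest",
)
print(results[0])
```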
## Latest results
These are the [latest results from run 2024-04-15T16:45:00.739078](https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__SmartToxic-7B/blob/main/results_2024-04-15T16-45-00.739078.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6569211781955342,
"acc_stderr": 0.03203242706274925,
"acc_norm": 0.6564714845320724,
"acc_norm_stderr": 0.0327026302811046,
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7266920949992095,
"mc2_stderr": 0.014647738443331712
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725225,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545797
},
"harness|hellaswag|10": {
"acc": 0.7162915753833897,
"acc_stderr": 0.0044987571944934005,
"acc_norm": 0.8879705238000398,
"acc_norm_stderr": 0.003147581209374547
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083138,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083138
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7266920949992095,
"mc2_stderr": 0.014647738443331712
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479696
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.012757375376754945
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
heliosprime/twitter_dataset_1713075844 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 16104
num_examples: 34
download_size: 10853
dataset_size: 16104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713075844"
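The YAML header above lists the available columns; a minimal loading sketch (assuming the standard `datasets` loader and the single `train` split declared there):
```python
from datasets import load_dataset

ds = load_dataset("heliosprime/twitter_dataset_1713075844", split="train")
print(ds.column_names)         # id, tweet_content, user_name, user_id, ...
print(ds[0]["tweet_content"])  # first scraped tweet
```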
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/fever | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- fever
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 5121231
num_examples: 140085
- name: dev
num_bytes: 296284
num_examples: 8079
- name: test
num_bytes: 297743
num_examples: 7937
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 3095105800
num_examples: 5416568
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 7530379
num_examples: 123142
configs:
- config_name: default
data_files:
- split: train
path: qrels/train.jsonl
- split: dev
path: qrels/dev.jsonl
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
--- |
Izazk/izaz-mind2web-dataset | ---
task_categories:
- text-generation
- text-classification
- feature-extraction
tags:
- code
- biology
- webdataset
- climate
- finance
- legal
pretty_name: Mind2webAction
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bigbio/mayosrs |
---
language:
- en
bigbio_language:
- English
license: cc0-1.0
multilinguality: monolingual
bigbio_license_shortname: CC0_1p0
pretty_name: MayoSRS
homepage: https://conservancy.umn.edu/handle/11299/196265
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- SEMANTIC_SIMILARITY
---
# Dataset Card for MayoSRS
## Dataset Description
- **Homepage:** https://conservancy.umn.edu/handle/11299/196265
- **Pubmed:** False
- **Public:** True
- **Tasks:** STS
MayoSRS consists of 101 clinical term pairs whose relatedness was determined by nine medical coders and three physicians from the Mayo Clinic.
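A minimal loading sketch; the config name below follows the usual BigBIO naming pattern (`<dataset>_bigbio_pairs` for the pairs schema) and is an assumption here, as is the `train` split:
```python
from datasets import load_dataset

# Assumption: BigBIO-style config name and a single "train" split; adjust if
# the repository uses different names. trust_remote_code may be required for
# script-based BigBIO loaders.
ds = load_dataset(
    "bigbio/mayosrs",
    name="mayosrs_bigbio_pairs",
    split="train",
    trust_remote_code=True,
)
print(len(ds))  # expected: 101 clinical term pairs
print(ds[0])    # a pair of clinical terms with its human-rated relatedness score
```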
## Citation Information
```
@article{pedersen2007measures,
title={Measures of semantic similarity and relatedness in the biomedical domain},
author={Pedersen, Ted and Pakhomov, Serguei VS and Patwardhan, Siddharth and Chute, Christopher G},
journal={Journal of biomedical informatics},
volume={40},
number={3},
pages={288--299},
year={2007},
publisher={Elsevier}
}
```
|
Aur0ra/test | ---
task_categories:
- text-classification
size_categories:
- n<1K
---
Hello World |
jizzu/llama2_indian_law_v3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 13349235
num_examples: 24607
download_size: 5933837
dataset_size: 13349235
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/autotree_automl_eye_movements_gosdt_l512_d3_sd1 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2726682247
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_eye_movements_gosdt_l512_d3_sd1"
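A minimal sketch for inspecting the nested-sequence features declared above (assuming the standard `datasets` loader):
```python
from datasets import load_dataset

ds = load_dataset(
    "yzhuang/autotree_automl_eye_movements_gosdt_l512_d3_sd1",
    split="validation",
)
row = ds[0]
# Nested sequences are returned as plain Python lists.
print(len(row["input_x"]), len(row["input_x"][0]))  # sequence length x feature dim
print(row["split_dimension"][:5])                   # first few split dimensions
```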
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wmatejuk/maestro-small-sustain | ---
dataset_info:
features:
- name: notes
struct:
- name: duration
sequence: float64
- name: end
sequence: float64
- name: pitch
sequence: int64
- name: start
sequence: float64
- name: velocity
sequence: int64
- name: composer
dtype: string
- name: title
dtype: string
- name: year
dtype: int64
- name: midi_filename
dtype: string
splits:
- name: train
num_bytes: 937037
num_examples: 5
download_size: 381157
dataset_size: 937037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/lee_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lee_arknights
This is the dataset of lee_arknights, containing 58 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 58 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 134 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 58 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 58 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 58 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 58 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 58 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 134 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 134 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 134 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
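The archives in the table are plain repository files; a minimal sketch for fetching one with `huggingface_hub` (the filename is taken from the table above):
```python
from huggingface_hub import hf_hub_download

# Download the 512x512 aligned variant to the local cache; unzip it yourself.
zip_path = hf_hub_download(
    repo_id="CyberHarem/lee_arknights",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(zip_path)
```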
|
maghwa/OpenHermes-2-AR-10K-49-930k-940k | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: 'null'
- name: topic
dtype: 'null'
- name: hash
sequence: int64
- name: model
dtype: string
- name: idx
dtype: 'null'
- name: title
dtype: string
- name: avatarUrl
dtype: string
- name: conversations
dtype: string
- name: model_name
dtype: 'null'
- name: source
dtype: string
- name: skip_prompt_formatting
dtype: 'null'
- name: language
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: category
dtype: 'null'
- name: views
dtype: float64
splits:
- name: train
num_bytes: 27746080
num_examples: 10001
download_size: 12059873
dataset_size: 27746080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
El-chapoo/Summary-wikihow | ---
dataset_info:
features:
- name: summary
dtype: string
splits:
- name: train
num_bytes: 20091856
num_examples: 35775
- name: validation
num_bytes: 1124715
num_examples: 2000
- name: test
num_bytes: 1116104
num_examples: 2000
download_size: 13513434
dataset_size: 22332675
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
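A minimal loading sketch for the three splits declared above (assuming the standard `datasets` loader):
```python
from datasets import load_dataset

ds = load_dataset("El-chapoo/Summary-wikihow")
print(ds)                               # train / validation / test, one `summary` column
print(ds["train"][0]["summary"][:200])  # peek at the first summary
```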
|
xiongfei/cake-dataset | ---
license: bigscience-openrail-m
---
|
Yagorka/pokemon-with-name-and-blip-caption | ---
license: openrail
task_categories:
- text-to-image
language:
- ru
- en
pretty_name: pokemons_with_prompt
size_categories:
- 1K<n<10K
--- |
gayanin/kaggle-native-v8-noised-test | ---
dataset_info:
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 561108
num_examples: 5140
- name: test
num_bytes: 70367
num_examples: 643
- name: validation
num_bytes: 71373
num_examples: 643
download_size: 315761
dataset_size: 702848
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_Yash21__SuperChat-7B | ---
pretty_name: Evaluation run of Yash21/SuperChat-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yash21/SuperChat-7B](https://huggingface.co/Yash21/SuperChat-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yash21__SuperChat-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T20:23:24.500362](https://huggingface.co/datasets/open-llm-leaderboard/details_Yash21__SuperChat-7B/blob/main/results_2024-01-10T20-23-24.500362.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2329591600032144,\n\
\ \"acc_stderr\": 0.02994880387912839,\n \"acc_norm\": 0.23306102803417483,\n\
\ \"acc_norm_stderr\": 0.030696738126112497,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111068,\n \"mc2\": 0.47206376560053864,\n\
\ \"mc2_stderr\": 0.01641409978357997\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19112627986348124,\n \"acc_stderr\": 0.011490055292778596,\n\
\ \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453947\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26070503883688506,\n\
\ \"acc_stderr\": 0.004381220409641168,\n \"acc_norm\": 0.26399123680541725,\n\
\ \"acc_norm_stderr\": 0.0043989372250384145\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774707,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774707\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.027136349602424063,\n\
\ \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.027136349602424063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1870967741935484,\n \"acc_stderr\": 0.02218571009225226,\n \"\
acc_norm\": 0.1870967741935484,\n \"acc_norm_stderr\": 0.02218571009225226\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"\
acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860674,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02534809746809783,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02534809746809783\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888239,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888239\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1908256880733945,\n \"acc_stderr\": 0.016847676400091095,\n \"\
acc_norm\": 0.1908256880733945,\n \"acc_norm_stderr\": 0.016847676400091095\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513714,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513714\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742177,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742177\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.015696008563807096,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.015696008563807096\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.04069306319721377,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.04069306319721377\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.014679255032111068,\n\
\ \"mc2\": 0.47206376560053864,\n \"mc2_stderr\": 0.01641409978357997\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n\
\ \"acc_stderr\": 0.014052376259225636\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.0027210765770416595\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yash21/SuperChat-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|arc:challenge|25_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|gsm8k|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hellaswag|10_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-23-24.500362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T20-23-24.500362.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- '**/details_harness|winogrande|5_2024-01-10T20-23-24.500362.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T20-23-24.500362.parquet'
- config_name: results
data_files:
- split: 2024_01_10T20_23_24.500362
path:
- results_2024-01-10T20-23-24.500362.parquet
- split: latest
path:
- results_2024-01-10T20-23-24.500362.parquet
---
# Dataset Card for Evaluation run of Yash21/SuperChat-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yash21/SuperChat-7B](https://huggingface.co/Yash21/SuperChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yash21__SuperChat-7B",
"harness_winogrande_5",
split="train")
```
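The aggregated metrics mentioned above live in the `results` configuration; below is a minimal sketch of loading them (the `results` name and the `latest` split come from this card's configuration list):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Yash21__SuperChat-7B",
    "results",
    split="latest",
)
print(results[0])
```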
## Latest results
These are the [latest results from run 2024-01-10T20:23:24.500362](https://huggingface.co/datasets/open-llm-leaderboard/details_Yash21__SuperChat-7B/blob/main/results_2024-01-10T20-23-24.500362.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2329591600032144,
"acc_stderr": 0.02994880387912839,
"acc_norm": 0.23306102803417483,
"acc_norm_stderr": 0.030696738126112497,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111068,
"mc2": 0.47206376560053864,
"mc2_stderr": 0.01641409978357997
},
"harness|arc:challenge|25": {
"acc": 0.19112627986348124,
"acc_stderr": 0.011490055292778596,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453947
},
"harness|hellaswag|10": {
"acc": 0.26070503883688506,
"acc_stderr": 0.004381220409641168,
"acc_norm": 0.26399123680541725,
"acc_norm_stderr": 0.0043989372250384145
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.027136349602424063,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.027136349602424063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.02218571009225226,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.02218571009225226
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860674,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02534809746809783,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02534809746809783
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888239,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888239
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1908256880733945,
"acc_stderr": 0.016847676400091095,
"acc_norm": 0.1908256880733945,
"acc_norm_stderr": 0.016847676400091095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513714,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513714
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742177,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742177
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807096,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721377,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721377
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111068,
"mc2": 0.47206376560053864,
"mc2_stderr": 0.01641409978357997
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225636
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416595
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
umd-zhou-lab/recycled_wiz70_v1 | ---
dataset_info:
features:
- name: data
struct:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 176446017
num_examples: 70000
download_size: 90580506
dataset_size: 176446017
---
# Dataset Card for "recycled_wiz70_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CaioConteudos/ultra | ---
license: openrail
---
|
kyujinpy/Ko-various-dataset | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 57552968
num_examples: 38174
download_size: 29047684
dataset_size: 57552968
---
# 🌈Ko-various-dataset
- Includes [kyujinpy/KOR-OpenOrca-Platypus-v3](https://huggingface.co/datasets/kyujinpy/KOR-OpenOrca-Platypus-v3).
- Additionally, the `COPA` and `Hellaswag` subsets of the [skt/kobest_v1](https://huggingface.co/datasets/skt/kobest_v1) dataset were converted into instruction-output data, following the [adaptLLM](https://huggingface.co/AdaptLLM) paper, and added.
- If you use this dataset to build a model or dataset, a brief attribution would be a great help to our research 😭😭
# Preprocessing
```
# Make the special text lists, manually.
[\n\t-=+,#/\$?:^$.@*\"–∼①②③④⑤ⓐⓑⓒ㉮㉯㉰㈜®...(omitted)...∂Σ∩∅φμσℝλΛ≥℃∉⊂θ±€Øπ√≠≤ε∈∫ωηαβ÷≈ס̊°²/]
```
- Using the regular expression above, characters from foreign languages other than Korean and English, emojis, special characters, and so on were removed.
- Examples whose output answers were too short were removed.
- Translation tasks were removed as much as possible. (Translation tasks are nearly 100% erroneous when translating into Korean.)
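As an illustration, here is a minimal sketch of this kind of filtering with 🤗 Datasets (the shortened character class and the length threshold are hypothetical stand-ins, not the exact values used for this dataset):
```python
import re

from datasets import load_dataset

# Hypothetical, shortened stand-in for the full character class above.
SPECIAL_CHARS = re.compile(r'[#/$?:^.@*"–∼①②③④⑤ⓐⓑⓒ㉮㉯㉰㈜®°²]')
MIN_OUTPUT_LEN = 20  # hypothetical minimum answer length

def clean_example(example):
    # Strip special characters from every text field.
    for key in ("instruction", "input", "output"):
        example[key] = SPECIAL_CHARS.sub("", example[key])
    return example

ds = load_dataset("kyujinpy/Ko-various-dataset", split="train")
ds = ds.map(clean_example)
ds = ds.filter(lambda ex: len(ex["output"]) >= MIN_OUTPUT_LEN)
```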
|
jlbaker361/tinydata | ---
dataset_info:
features:
- name: label
dtype: int64
- name: sequence
sequence: int64
- name: occurence
dtype: int64
- name: split
dtype: string
splits:
- name: train
num_bytes: 13421
num_examples: 10
download_size: 5408
dataset_size: 13421
---
# Dataset Card for "tinydata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jisukim8873__mistralai-case-0-0 | ---
pretty_name: Evaluation run of jisukim8873/mistralai-case-0-0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jisukim8873/mistralai-case-0-0](https://huggingface.co/jisukim8873/mistralai-case-0-0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jisukim8873__mistralai-case-0-0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T05:22:48.916659](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistralai-case-0-0/blob/main/results_2024-03-22T05-22-48.916659.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6268607092515852,\n\
\ \"acc_stderr\": 0.03248362504632636,\n \"acc_norm\": 0.6324287443541512,\n\
\ \"acc_norm_stderr\": 0.03314366006504236,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.41818062101055176,\n\
\ \"mc2_stderr\": 0.01431731999903095\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848029,\n\
\ \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.01429122839353659\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6279625572595101,\n\
\ \"acc_stderr\": 0.004823604775015904,\n \"acc_norm\": 0.8308105954989046,\n\
\ \"acc_norm_stderr\": 0.003741528956315842\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7387096774193549,\n \"acc_stderr\": 0.02499305339776483,\n \"\
acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.02499305339776483\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139746,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139746\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.01424887354921758,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.01424887354921758\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.288268156424581,\n\
\ \"acc_stderr\": 0.01514913286020942,\n \"acc_norm\": 0.288268156424581,\n\
\ \"acc_norm_stderr\": 0.01514913286020942\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596729,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596729\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304324,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304324\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.41818062101055176,\n\
\ \"mc2_stderr\": 0.01431731999903095\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.01150895769072276\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3684609552691433,\n \
\ \"acc_stderr\": 0.013287342651674569\n }\n}\n```"
repo_url: https://huggingface.co/jisukim8873/mistralai-case-0-0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|arc:challenge|25_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|gsm8k|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hellaswag|10_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-22-48.916659.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T05-22-48.916659.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- '**/details_harness|winogrande|5_2024-03-22T05-22-48.916659.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T05-22-48.916659.parquet'
- config_name: results
data_files:
- split: 2024_03_22T05_22_48.916659
path:
- results_2024-03-22T05-22-48.916659.parquet
- split: latest
path:
- results_2024-03-22T05-22-48.916659.parquet
---
# Dataset Card for Evaluation run of jisukim8873/mistralai-case-0-0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jisukim8873/mistralai-case-0-0](https://huggingface.co/jisukim8873/mistralai-case-0-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jisukim8873__mistralai-case-0-0",
"harness_winogrande_5",
split="train")
```
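The aggregated scores live in the `results` configuration defined in this card's metadata. A hypothetical variant of the same call (assuming the `latest` split declared above) would be:
```python
from datasets import load_dataset

# A sketch, not official tooling: pull the aggregated metrics for the
# newest run via the "results" configuration and its "latest" split.
agg = load_dataset("open-llm-leaderboard/details_jisukim8873__mistralai-case-0-0",
                   "results",
                   split="latest")
```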
## Latest results
These are the [latest results from run 2024-03-22T05:22:48.916659](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistralai-case-0-0/blob/main/results_2024-03-22T05-22-48.916659.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6268607092515852,
"acc_stderr": 0.03248362504632636,
"acc_norm": 0.6324287443541512,
"acc_norm_stderr": 0.03314366006504236,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.41818062101055176,
"mc2_stderr": 0.01431731999903095
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848029,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.01429122839353659
},
"harness|hellaswag|10": {
"acc": 0.6279625572595101,
"acc_stderr": 0.004823604775015904,
"acc_norm": 0.8308105954989046,
"acc_norm_stderr": 0.003741528956315842
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.02499305339776483,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.02499305339776483
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139746,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203627,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.01424887354921758,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.01424887354921758
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.288268156424581,
"acc_stderr": 0.01514913286020942,
"acc_norm": 0.288268156424581,
"acc_norm_stderr": 0.01514913286020942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998482,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304324,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304324
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.41818062101055176,
"mc2_stderr": 0.01431731999903095
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.01150895769072276
},
"harness|gsm8k|5": {
"acc": 0.3684609552691433,
"acc_stderr": 0.013287342651674569
}
}
```
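As one illustration of how these per-task numbers can be aggregated, the sketch below downloads the results file linked above and averages the `hendrycksTest` (MMLU) accuracies. This is a hedged example, not the leaderboard's own aggregation code, and the exact top-level layout of the JSON file is an assumption:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in this card (repo id and
# filename taken from the metadata above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jisukim8873__mistralai-case-0-0",
    filename="results_2024-03-22T05-22-48.916659.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the per-task scores sit either at the top level (as printed
# above) or under a "results" key.
results = data.get("results", data)
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(sum(mmlu_accs) / len(mmlu_accs))  # mean accuracy over the MMLU subjects
```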
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DataGuard/wizard-en | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 335577067
num_examples: 143000
download_size: 160896543
dataset_size: 335577067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wizard-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_wrong_title_v5_full_last_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7855838.683639287
num_examples: 4778
- name: validation
num_bytes: 361864
num_examples: 300
download_size: 1363399
dataset_size: 8217702.683639287
---
# Dataset Card for "squad_qa_wrong_title_v5_full_last_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_48_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5213941
num_examples: 16650
download_size: 2036389
dataset_size: 5213941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_48_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
angdong/nate-news-economy | ---
license: mit
---
|
ed-00/SQUAD-Extractive-Instruction | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: instruction
dtype: string
- name: output
dtype: string
- name: template
dtype: string
splits:
- name: train
num_bytes: 443339186
num_examples: 217918
download_size: 74271026
dataset_size: 443339186
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Squad 1 & 2 together with instruction for Extractive Instruction tuning.
|
Locutusque/dibt-instruct | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 6743305
num_examples: 3340
download_size: 3803007
dataset_size: 6743305
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
task_categories:
- text-generation
- text2text-generation
- fill-mask
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for dibt-instruct
`Locutusque/dibt-instruct` is a dataset derived from the `10k_prompts_ranked` dataset, where an answer has been generated for each prompt using Google's Gemini Pro language model. Before augmenting, 5,000 prompts were sampled from the original 10,000 prompts, and those with a quality score less than or equal to 3.5 were removed, resulting in 3,340 prompt-answer pairs.
## Dataset Details
- **Curated by:** Derived from the `10k_prompts_ranked` dataset created by Argilla, Hugging Face, and the Prompt Collective community.
- **Language:** English
- **License:** Inherited from `10k_prompts_ranked` dataset [More Information Needed]
## Dataset Description
This augmented dataset contains 3,340 examples, each consisting of a prompt from the original `10k_prompts_ranked` dataset and a generated answer using Google's Gemini Pro language model.
The prompts were filtered to only include those with an average quality rating greater than 3.5 out of 5 in the original dataset.
## Dataset Creation
### Source Data
The source data is the `10k_prompts_ranked` dataset, which contains 10,331 prompts with quality rankings from 314 community members.
### Data Augmentation
1. 5,000 prompts were randomly sampled from the `10k_prompts_ranked` dataset.
2. Prompts with an average quality score <= 3.5 were removed, leaving 3,340 prompts.
3. For each remaining prompt, an answer was generated using Google's Gemini Pro language model.
4. The generated answers were combined with the corresponding prompts to create the augmented dataset (a sketch of this pipeline is shown after this list).
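A minimal sketch of the four steps above is given below. The source repo id, the field names (`prompt`, `avg_rating`), and the Gemini client call are assumptions for illustration, not verified details of the original run:
```python
from datasets import load_dataset
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # hypothetical credentials
model = genai.GenerativeModel("gemini-pro")

ranked = load_dataset("DIBT/10k_prompts_ranked", split="train")
sampled = ranked.shuffle(seed=0).select(range(5000))            # step 1
kept = sampled.filter(lambda ex: ex["avg_rating"] > 3.5)        # step 2

pairs = []
for ex in kept:
    answer = model.generate_content(ex["prompt"]).text          # step 3
    pairs.append({"question": ex["prompt"], "answer": answer})  # step 4
```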
## Dataset Structure
Each example in the augmented dataset is a dictionary with the following keys:
- `question`: The original prompt text from `10k_prompts_ranked`.
- `answer`: The generated answer text from Gemini Pro for this prompt.
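Loading and inspecting one pair (using the default config and `train` split declared in this card's metadata):
```python
from datasets import load_dataset

ds = load_dataset("Locutusque/dibt-instruct", split="train")
example = ds[0]
print(example["question"])
print(example["answer"])
```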
## Intended Use
This augmented dataset can be used for tasks such as:
- Training language models on prompt-answer pairs
- Evaluating the quality of generated answers
- Analyzing biases or limitations in Gemini Pro's outputs
- Data augmentation for other language tasks
## Limitations
- The generated answers come from a single language model (Gemini Pro) and may reflect biases of that model.
- The quality of the generated answers has not been manually verified.
- The prompts were filtered based only on the average quality score, other filtering criteria could be applied.
## Maintenance
This is currently a static dataset with no plans for updates. However, the process of generating answers could be repeated with different language models or prompts from the original `10k_prompts_ranked` dataset. |
arnepeine/medspeech | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 133517.0
num_examples: 3
download_size: 126401
dataset_size: 133517.0
---
# Dataset Card for "medspeech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
demelin/wino_x | ---
annotations_creators:
- no-annotation
language:
- en
- de
- fr
- ru
language_creators:
- machine-generated
- expert-generated
license:
- mit
multilinguality:
- multilingual
- translation
pretty_name: Wino-X
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- translation
- coreference resolution
- commonsense reasoning
task_ids:
- multiple-choice-qa
- language-modeling
---
# Dataset Card for Wino-X
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Wino-X repository](https://github.com/demelin/Wino-X)
- **Repository:** [Wino-X repository](https://github.com/demelin/Wino-X)
- **Paper:** [Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution](https://aclanthology.org/2021.emnlp-main.670/)
- **Leaderboard:** [N/A]
- **Point of Contact:** [Denis Emelin](https://demelin.github.io)
### Dataset Summary
Wino-X is a parallel dataset of German, French, and Russian Winograd schemas, aligned with their English
counterparts, used to examine whether neural machine translation models can perform coreference resolution that
requires commonsense knowledge, and whether multilingual language models are capable of commonsense reasoning across
multiple languages.
### Supported Tasks and Leaderboards
- translation: The dataset can be used to evaluate translations of ambiguous source sentences, as produced by translation models. A [pretrained transformer-based NMT model](https://huggingface.co/Helsinki-NLP/opus-mt-en-de) can be used for this purpose (see the sketch after this list).
- coreference-resolution: The dataset can be used to rank alternative translations of an ambiguous source sentence that differ in the chosen referent of an ambiguous source pronoun. A [pretrained transformer-based NMT model](https://huggingface.co/Helsinki-NLP/opus-mt-en-de) can be used for this purpose.
- commonsense-reasoning: The dataset can also be used to evaluate whether pretrained multilingual language models can perform commonsense reasoning in (or across) multiple languages by identifying the correct filler in a cloze completion task. An [XLM-based model](https://huggingface.co/xlm-roberta-base) can be used for this purpose.
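For illustration, here is a minimal ranking sketch for an MT-Wino-X instance — an assumption-laden example, not the paper's official evaluation code (requires `torch` and a recent version of 🤗 Transformers that supports `text_target`):
```python
import torch
from transformers import MarianMTModel, MarianTokenizer

name = "Helsinki-NLP/opus-mt-en-de"
tokenizer = MarianTokenizer.from_pretrained(name)
model = MarianMTModel.from_pretrained(name).eval()

def log_likelihood(source: str, translation: str) -> float:
    """Summed log-probability of `translation` given `source` under the NMT model."""
    inputs = tokenizer(source, return_tensors="pt")
    labels = tokenizer(text_target=translation, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(**inputs, labels=labels).loss  # mean cross-entropy per target token
    return -loss.item() * labels.shape[1]

src = "The woman looked for a different vase for the bouquet because it was too small."
candidates = [
    "Die Frau suchte nach einer anderen Vase für den Blumenstrauß, weil sie zu klein war.",
    "Die Frau suchte nach einer anderen Vase für den Blumenstrauß, weil er zu klein war.",
]
scores = [log_likelihood(src, c) for c in candidates]
print(scores.index(max(scores)) + 1)  # compare against the instance's `answer` field (here: 1)
```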
### Languages
The dataset (both its MT and LM portions) is available in the following translation pairs: English-German, English-French, English-Russian. All English sentences included in *Wino-X* were extracted from publicly available parallel corpora, as detailed in the accompanying paper, and represent the dataset-specific language varieties. All non-English sentences were obtained through machine translation and may, as such, exhibit features of translationese.
## Dataset Structure
### Data Instances
The following represents a typical *MT-Wino-X* instance (for the English-German translation pair):
{"qID": "3UDTAB6HH8D37OQL3O6F3GXEEOF09Z-1",
"sentence": "The woman looked for a different vase for the bouquet because it was too small.",
"translation1": "Die Frau suchte nach einer anderen Vase für den Blumenstrauß, weil sie zu klein war.",
"translation2": "Die Frau suchte nach einer anderen Vase für den Blumenstrauß, weil er zu klein war.",
"answer": 1,
"pronoun1": "sie",
"pronoun2": "er",
"referent1_en": "vase",
"referent2_en": "bouquet",
"true_translation_referent_of_pronoun1_de": "Vase",
"true_translation_referent_of_pronoun2_de": "Blumenstrauß",
"false_translation_referent_of_pronoun1_de": "Vase",
"false_translation_referent_of_pronoun2_de": "Blumenstrauß"}
The following represents a typical *LM-Wino-X* instance (for the English-French translation pair):
{"qID": "3UDTAB6HH8D37OQL3O6F3GXEEOF09Z-1",
"sentence": "The woman looked for a different vase for the bouquet because it was too small.",
"context_en": "The woman looked for a different vase for the bouquet because _ was too small.",
"context_fr": "La femme a cherché un vase différent pour le bouquet car _ était trop petit.",
"option1_en": "the bouquet",
"option2_en": "the vase",
"option1_fr": "le bouquet",
"option2_fr": "le vase",
"answer": 2,
"context_referent_of_option1_fr": "bouquet",
"context_referent_of_option2_fr": "vase"}
### Data Fields
For *MT-Wino-X*:
- "qID": Unique identifier ID for this dataset instance.
- "sentence": English sentence containing the ambiguous pronoun 'it'.
- "translation1": First translation candidate.
- "translation2": Second translation candidate.
- "answer": ID of the correct translation.
- "pronoun1": Translation of the ambiguous source pronoun in translation1.
- "pronoun2": Translation of the ambiguous source pronoun in translation2.
- "referent1_en": English referent of the translation of the ambiguous source pronoun in translation1.
- "referent2_en": English referent of the translation of the ambiguous source pronoun in translation2.
- "true_translation_referent_of_pronoun1_[TGT-LANG]": Target language referent of pronoun1 in the correct translation.
- "true_translation_referent_of_pronoun2_[TGT-LANG]": Target language referent of pronoun2 in the correct translation.
- "false_translation_referent_of_pronoun1_[TGT-LANG]": Target language referent of pronoun1 in the incorrect translation.
- "false_translation_referent_of_pronoun2_[TGT-LANG]": Target language referent of pronoun2 in the incorrect translation.
For *LM-Wino-X*:
- "qID": Unique identifier ID for this dataset instance.
- "sentence": English sentence containing the ambiguous pronoun 'it'.
- "context_en": Same English sentence, where 'it' is replaced by a gap.
- "context_fr": Target language translation of the English sentence, where the translation of 'it' is replaced by a gap.
- "option1_en": First filler option for the English sentence.
- "option2_en": Second filler option for the English sentence.
- "option1_[TGT-LANG]": First filler option for the target language sentence.
- "option2_[TGT-LANG]": Second filler option for the target language sentence.
- "answer": ID of the correct gap filler.
- "context_referent_of_option1_[TGT-LANG]": English translation of option1_[TGT-LANG].
- "context_referent_of_option2_[TGT-LANG]": English translation of option2_[TGT-LANG]
### Data Splits
*Wino-X* was designed as an evaluation-only benchmark and is therefore intended to be used for zero-shot testing only. However, users are very welcome to split the data as they wish :)
## Dataset Creation
### Curation Rationale
Please refer to [Section 2 in the dataset paper](https://aclanthology.org/2021.emnlp-main.670.pdf).
### Source Data
#### Initial Data Collection and Normalization
Please refer to [Section 2 in the dataset paper](https://aclanthology.org/2021.emnlp-main.670.pdf).
#### Who are the source language producers?
Please refer to [Section 2 in the dataset paper](https://aclanthology.org/2021.emnlp-main.670.pdf).
### Annotations
#### Annotation process
Please refer to [Section 2 in the dataset paper](https://aclanthology.org/2021.emnlp-main.670.pdf).
#### Who are the annotators?
Annotations were generated automatically and verified by the dataset author / curator for correctness.
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
Please refer to ['Ethical Considerations' in the dataset paper](https://aclanthology.org/2021.emnlp-main.670.pdf).
### Discussion of Biases
Please refer to ['Ethical Considerations' in the dataset paper](https://aclanthology.org/2021.emnlp-main.670.pdf).
### Other Known Limitations
Please refer to ['Ethical Considerations' in the dataset paper](https://aclanthology.org/2021.emnlp-main.670.pdf).
## Additional Information
### Dataset Curators
[Denis Emelin](https://demelin.github.io)
### Licensing Information
MIT
### Citation Information
```
@inproceedings{Emelin2021WinoXMW,
  title={Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution},
  author={Denis Emelin and Rico Sennrich},
  booktitle={EMNLP},
  year={2021}
}
```
 |
SuperSecureHuman/competition_math_hf_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: problem
dtype: string
- name: level
dtype: string
- name: type
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 5984772
num_examples: 7500
- name: test
num_bytes: 3732833
num_examples: 5000
download_size: 4866025
dataset_size: 9717605
---
# Dataset Card for "competition_math_hf_dataset"
* **Homepage** - https://huggingface.co/datasets/hendrycks/competition_math
This is just the competition math dataset, put into HF dataset format for ease of use in any fine-tuning task.
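A minimal loading sketch (not part of the original card):
```python
from datasets import load_dataset

# The card's metadata declares "train" (7,500 problems) and "test" (5,000 problems) splits.
ds = load_dataset("SuperSecureHuman/competition_math_hf_dataset")

example = ds["train"][0]
print(example["level"], example["type"])
print(example["problem"])
print(example["solution"])
```
 |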
CyberHarem/an_shan_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of an_shan/鞍山/鞍山 (Azur Lane)
This is the dataset of an_shan/鞍山/鞍山 (Azur Lane), containing 25 images and their tags.
The core tags of this character are `green_eyes, long_hair, green_hair, ponytail, hair_ornament, bangs, hairclip, braid, very_long_hair, breasts, hat, horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 28.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 25       | 17.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 56 | 35.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 25       | 25.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 56 | 49.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/an_shan_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, fingerless_gloves, epaulettes, black_gloves, long_sleeves, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | fingerless_gloves | epaulettes | black_gloves | long_sleeves | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------------|:-------------|:---------------|:---------------|:----------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
Supreeta03/CREMA-audioData-labelled | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: emotion
dtype: string
splits:
- name: train
num_bytes: 390604918.1552518
num_examples: 4762
- name: validation
num_bytes: 96343298.96096534
num_examples: 1191
- name: test
num_bytes: 120976985.95778285
num_examples: 1489
download_size: 605873392
dataset_size: 607925203.074
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Fraternitas/out-of-scope | ---
task_categories:
- text-classification
language:
- en
size_categories:
- 10K<n<100K
---
Here you can find the datasets generated to build an out-of-context classifier (ElektraGo Chat).
There are different versions of the dataset:
- Highly imbalanced
- Imbalanced
- Balanced (via data augmentation and undersampling)
|