| datasetId | card |
|---|---|
Tsuinzues/applejackmlp | ---
license: openrail
---
|
recwizard/redial_unicrs | ---
dataset_info:
features:
- name: messages
sequence: string
- name: rec
sequence: int32
- name: recNames
sequence: string
splits:
- name: train
num_bytes: 68323855
num_examples: 111458
- name: validation
num_bytes: 7454184
num_examples: 12395
- name: test
num_bytes: 8958856
num_examples: 15704
download_size: 7754891
dataset_size: 84736895
---
# Dataset Card for "redial_unicrs"
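The `rec` and `recNames` features above are parallel sequences, so each recommended item id lines up with its name. A minimal, self-contained sketch of pairing them for one row (the sample record is hypothetical; real rows come from `load_dataset("recwizard/redial_unicrs")`):

```python
def pair_recommendations(example):
    """Zip the parallel `rec` (int32 ids) and `recNames` sequences
    of one row into (id, name) tuples."""
    return list(zip(example["rec"], example["recNames"]))

# Hypothetical row following the feature schema in the card above.
example = {
    "messages": ["Hi!", "Can you recommend a movie?"],
    "rec": [1234, 5678],
    "recNames": ["The Matrix", "Inception"],
}

print(pair_recommendations(example))  # [(1234, 'The Matrix'), (5678, 'Inception')]
```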
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SHS/cancer_test_data2 | ---
dataset_info:
features:
- name: passage
dtype: string
- name: passage_token
sequence: string
splits:
- name: train
num_bytes: 46724
num_examples: 1
download_size: 0
dataset_size: 46724
---
# Dataset Card for "cancer_test_data2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhixiaoni/CROHME_channel_add_images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 33394444.966
num_examples: 8834
download_size: 30300654
dataset_size: 33394444.966
---
# Dataset Card for "CROHME_channel_add_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cpatel321/Adobe_behaviour_image_sample | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 95065758
num_examples: 1398
dataset_name: 'adobe_behaviour_image_sample'
---
# Dataset Card for "adobe-behaviour-simulation-task-dataset's-images-and-captions"
A sample dataset of roughly 1,400 image-caption pairs. |
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_2_10000000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 191148
num_examples: 6699
download_size: 121747
dataset_size: 191148
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_2_10000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abir18/sample_suggestion_acceptance | ---
license: mit
---
|
MarkGG/Romance-baseline | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39176840.7
num_examples: 1105002
- name: validation
num_bytes: 4352982.3
num_examples: 122778
download_size: 23278822
dataset_size: 43529823.0
---
# Dataset Card for "Romance-baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
macadeliccc/distilabel-neurology-preferences-2k-orca-format | ---
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 9755593
num_examples: 1994
download_size: 3840000
dataset_size: 9755593
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "distilabel-neurology-preferences-2k-orca-format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
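Given the `system`/`question`/`chosen`/`rejected` schema above, each row can be flattened into the (prompt, chosen, rejected) triple that DPO-style preference trainers typically expect. A minimal sketch; the row below is hypothetical, and the prompt format is an assumption rather than part of the dataset:

```python
def to_preference_triple(row):
    """Combine `system` and `question` into a single prompt and keep the
    chosen/rejected completions, per the schema in the card above."""
    return {
        "prompt": f"{row['system']}\n\n{row['question']}",
        "chosen": row["chosen"],
        "rejected": row["rejected"],
    }

# Hypothetical row for illustration only.
row = {
    "system": "You are a careful neurology assistant.",
    "question": "What does the glymphatic system do?",
    "chosen": "It helps clear metabolic waste from the brain during sleep.",
    "rejected": "It pumps blood to the brain.",
}
print(to_preference_triple(row)["prompt"].startswith("You are"))  # True
```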
If you use this dataset, please cite it as follows:
```bibtex
@misc{tdolan_distilabel_neurology_preferences_2024,
author = {Tim Dolan},
title = {{Distilabel Neurology Preferences - 2k Samples}},
year = {2024},
howpublished = {Hugging Face Hub},
url = {https://huggingface.co/datasets/macadeliccc/distilabel-neurology-preferences-2k-orca-format}
}
``` |
bennsalter/caulking_images | ---
language:
- en
--- |
ryanramos/vqa-with-coco-annotated-no-images | ---
dataset_info:
features:
- name: license
dtype: int64
- name: file_name
dtype: string
- name: coco_url
dtype: string
- name: height
dtype: int64
- name: width
dtype: int64
- name: date_captured
dtype: string
- name: flickr_url
dtype: string
- name: captions
list:
- name: caption
dtype: string
- name: id
dtype: int64
- name: questions
list:
- name: answer_type
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: image_id
dtype: int64
- name: multiple_choice_answer
dtype: string
- name: question
dtype: string
- name: question_id
dtype: int64
- name: question_type
dtype: string
- name: image_id
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: qa_statement
dtype: string
splits:
- name: train
num_bytes: 197985750
num_examples: 82783
download_size: 49025655
dataset_size: 197985750
---
# Dataset Card for "vqa-with-coco-annotated-no-images"
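As an illustration of the nested `questions`/`answers` schema above, here is a small, self-contained sketch that computes what fraction of the crowd answers agree with `multiple_choice_answer` for one question record. The record below is hypothetical but follows the feature list in the card:

```python
def answer_agreement(question):
    """Fraction of the per-question `answers` list that matches the
    `multiple_choice_answer` field (schema as in the feature list above)."""
    answers = [a["answer"] for a in question["answers"]]
    return answers.count(question["multiple_choice_answer"]) / len(answers)

# Hypothetical record shaped like one entry of the `questions` list.
question = {
    "question": "What color is the bus?",
    "multiple_choice_answer": "red",
    "answers": [
        {"answer": "red", "answer_confidence": "yes", "answer_id": 1},
        {"answer": "red", "answer_confidence": "yes", "answer_id": 2},
        {"answer": "orange", "answer_confidence": "maybe", "answer_id": 3},
        {"answer": "red", "answer_confidence": "yes", "answer_id": 4},
    ],
}
print(answer_agreement(question))  # 0.75
```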
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/e9d30f3e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1339
dataset_size: 186
---
# Dataset Card for "e9d30f3e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GeneralRincewind/IndieMusicTokenizedv2 | ---
dataset_info:
features:
- name: original_sampling_rate
dtype: int64
- name: tokens
sequence:
sequence: int64
splits:
- name: train
num_bytes: 61969456
num_examples: 1291
download_size: 9693739
dataset_size: 61969456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LabelStudio/IMDB_Sample_100 | ---
tags:
- Tutorial
size_categories:
- n<1K
---
## Zero to One: Label Studio Tutorial Dataset
This dataset is used in the [Label Studio Zero to One Tutorial](https://hubs.ly/Q01CNlyy0). It was originally provided by [Andrew Maas](https://ai.stanford.edu/~amaas/) ([ref](https://ai.stanford.edu/~amaas/papers/wvSent_acl2011.bib)) and is an open, well-known dataset; the original contains over 100,000 reviews.
### Paring down 100,000 reviews to 100
To pare this dataset down to 100 reviews, [Chris Hoge](https://huggingface.co/hogepodge) and I ([Erin Mikail Staples](https://huggingface.co/erinmikail)) took the following steps.
We started by [writing a script](https://s3.amazonaws.com/labelstud.io/datasets/IMDB_collect.py) that walked the directory structure to capture the data and metadata as rows. The data was written in randomized batches, with rows corresponding to:
- 0–25,000: labeled training data, with positive and negative sentiment mixed.
- 25,001–75,000: unlabeled training data.
- 75,001–100,000: labeled testing data, with positive and negative sentiment mixed.
These batches were also written out as separate files for convenience. Finally, the first 100 rows of each batch were written out as separate files to support faster loading and a more streamlined learning experience.
Our thanks to Andrew Maas for providing this free dataset from his research.
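The batch layout above can be sketched as a tiny row-to-batch lookup; the boundary values are taken directly from the list above, and the batch names are illustrative:

```python
def batch_for_row(row):
    """Map a row index in the randomized IMDB dump to its batch,
    using the row ranges described above."""
    if 0 <= row <= 25_000:
        return "labeled training"
    if row <= 75_000:
        return "unlabeled training"
    if row <= 100_000:
        return "labeled testing"
    raise ValueError(f"row {row} out of range")

print(batch_for_row(10))       # labeled training
print(batch_for_row(50_000))   # unlabeled training
print(batch_for_row(99_999))   # labeled testing
```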
## Did you try your hand at this tutorial?
We'd love to hear your results and how it worked out for you!
Did you build something else with the data?
Let us know! Join us in the [Label Studio Slack Community](https://hubs.ly/Q01CNprb0) or drop us an [email](mailto:community@labelstud.io).
## Enjoy what we're working on?
Drop us a star on [GitHub](https://hubs.ly/Q01CNp4W0)!
|
bigbio/sem_eval_2024_task_2 | ---
language:
- en
bigbio_language:
- English
multilinguality: monolingual
pretty_name: SemEval 2024 Task 2
homepage: https://sites.google.com/view/nli4ct/semeval-2024?authuser=0
bigbio_pubmed: false
bigbio_public: true
bigbio_tasks:
- TEXTUAL_ENTAILMENT
---
# Dataset Card for SemEval 2024 Task 2
## Dataset Description
- **Homepage:** https://sites.google.com/view/nli4ct/semeval-2024?authuser=0
- **Pubmed:** False
- **Public:** True
- **Tasks:** TE
## Dataset
(Description copied from dataset homepage)
The statements and evidence are generated by clinical domain experts, clinical trial organisers, and research oncologists from the Cancer Research UK Manchester Institute and the Digital Experimental Cancer Medicine Team. There are a total of (TBD) statements split evenly across the different sections and classes.
## Description
Each Clinical Trial Report (CTR) consists of 4 sections:
- Eligibility criteria - a set of conditions for patients to be allowed to take part in the clinical trial.
- Intervention - information concerning the type, dosage, frequency, and duration of treatments being studied.
- Results - number of participants in the trial, outcome measures, units, and the results.
- Adverse events - signs and symptoms observed in patients during the clinical trial.
For this task, each CTR may contain 1-2 patient groups, called cohorts or arms. These groups may receive different treatments, or have different baseline characteristics.
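As an illustration only, a CTR with the four sections and arms described above might be held in memory like this. The field names and contents are hypothetical, not the official NLI4CT data format:

```python
# Hypothetical in-memory representation of one CTR; not the official format.
ctr = {
    "eligibility": "Adults aged 18-75 with stage II disease...",
    "intervention": "Drug X, 50 mg orally, twice daily for 12 weeks.",
    "results": "120 participants; primary outcome: progression-free survival.",
    "adverse_events": "Grade 1 nausea (12 patients), fatigue (8 patients).",
    "arms": ["treatment", "placebo"],  # a CTR may have 1-2 arms
}

print(len(ctr["arms"]))  # 2
```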
## Citation Information
```
@article{,
author = {},
title = {},
journal = {},
volume = {},
year = {},
url = {},
doi = {},
biburl = {},
bibsource = {}
}
```
|
joey234/mmlu-high_school_statistics-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 4754
num_examples: 5
download_size: 0
dataset_size: 4754
---
# Dataset Card for "mmlu-high_school_statistics-dev"
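The `answer` feature above is a `class_label`, so each row stores an integer that maps back to a choice letter. A minimal decoding sketch using the names table from the card:

```python
ANSWER_NAMES = ["A", "B", "C", "D"]  # class_label names listed in the card above

def decode_answer(label: int) -> str:
    """Convert the stored integer `answer` back to its choice letter."""
    return ANSWER_NAMES[label]

print(decode_answer(2))  # C
```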
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_unit-mesh__autodev-deepseek-6.7b-finetunes-poc | ---
pretty_name: Evaluation run of unit-mesh/autodev-deepseek-6.7b-finetunes-poc
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [unit-mesh/autodev-deepseek-6.7b-finetunes-poc](https://huggingface.co/unit-mesh/autodev-deepseek-6.7b-finetunes-poc)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_unit-mesh__autodev-deepseek-6.7b-finetunes-poc\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T10:40:27.582189](https://huggingface.co/datasets/open-llm-leaderboard/details_unit-mesh__autodev-deepseek-6.7b-finetunes-poc/blob/main/results_2024-03-15T10-40-27.582189.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3755993742028831,\n\
\ \"acc_stderr\": 0.03423029641090857,\n \"acc_norm\": 0.3777627808573008,\n\
\ \"acc_norm_stderr\": 0.03497427995401455,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4411102243659991,\n\
\ \"mc2_stderr\": 0.01482842849226169\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.33276450511945393,\n \"acc_stderr\": 0.013769863046192307,\n\
\ \"acc_norm\": 0.35409556313993173,\n \"acc_norm_stderr\": 0.013975454122756555\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.40669189404501094,\n\
\ \"acc_stderr\": 0.004902125388002211,\n \"acc_norm\": 0.5240987851025692,\n\
\ \"acc_norm_stderr\": 0.004983982396187372\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.03000048544867599,\n\
\ \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.03000048544867599\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.023330654054535892,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.023330654054535892\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.041049472699033945,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.041049472699033945\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4064516129032258,\n\
\ \"acc_stderr\": 0.027941727346256308,\n \"acc_norm\": 0.4064516129032258,\n\
\ \"acc_norm_stderr\": 0.027941727346256308\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.032406615658684086,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.032406615658684086\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.03756335775187896,\n\
\ \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.03756335775187896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4595959595959596,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.39378238341968913,\n \"acc_stderr\": 0.03526077095548237,\n\
\ \"acc_norm\": 0.39378238341968913,\n \"acc_norm_stderr\": 0.03526077095548237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.024468615241478905,\n\
\ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478905\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.03163145807552379,\n \
\ \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.03163145807552379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.363302752293578,\n \"acc_stderr\": 0.020620603919625804,\n \"\
acc_norm\": 0.363302752293578,\n \"acc_norm_stderr\": 0.020620603919625804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.38235294117647056,\n \"acc_stderr\": 0.03410785338904719,\n \"\
acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.03410785338904719\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.379746835443038,\n \"acc_stderr\": 0.031591887529658504,\n \
\ \"acc_norm\": 0.379746835443038,\n \"acc_norm_stderr\": 0.031591887529658504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.03252113489929187,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.03252113489929187\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.47107438016528924,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.47107438016528924,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.048657775704107696,\n\
\ \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.048657775704107696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6153846153846154,\n\
\ \"acc_stderr\": 0.03187195347942466,\n \"acc_norm\": 0.6153846153846154,\n\
\ \"acc_norm_stderr\": 0.03187195347942466\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.41890166028097064,\n\
\ \"acc_stderr\": 0.01764320505237719,\n \"acc_norm\": 0.41890166028097064,\n\
\ \"acc_norm_stderr\": 0.01764320505237719\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.407514450867052,\n \"acc_stderr\": 0.0264545781469315,\n\
\ \"acc_norm\": 0.407514450867052,\n \"acc_norm_stderr\": 0.0264545781469315\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38562091503267976,\n \"acc_stderr\": 0.02787074527829032,\n\
\ \"acc_norm\": 0.38562091503267976,\n \"acc_norm_stderr\": 0.02787074527829032\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n\
\ \"acc_stderr\": 0.027368078243971625,\n \"acc_norm\": 0.3665594855305466,\n\
\ \"acc_norm_stderr\": 0.027368078243971625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.026229649178821167,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.026229649178821167\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169938,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169938\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2920469361147327,\n\
\ \"acc_stderr\": 0.011613349136271815,\n \"acc_norm\": 0.2920469361147327,\n\
\ \"acc_norm_stderr\": 0.011613349136271815\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.02824568739146291,\n\
\ \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.02824568739146291\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3366013071895425,\n \"acc_stderr\": 0.01911721391149515,\n \
\ \"acc_norm\": 0.3366013071895425,\n \"acc_norm_stderr\": 0.01911721391149515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4489795918367347,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.4489795918367347,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4129353233830846,\n\
\ \"acc_stderr\": 0.03481520803367348,\n \"acc_norm\": 0.4129353233830846,\n\
\ \"acc_norm_stderr\": 0.03481520803367348\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.0374005938202932,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.0374005938202932\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.39766081871345027,\n \"acc_stderr\": 0.0375363895576169,\n\
\ \"acc_norm\": 0.39766081871345027,\n \"acc_norm_stderr\": 0.0375363895576169\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4411102243659991,\n\
\ \"mc2_stderr\": 0.01482842849226169\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5666929755327546,\n \"acc_stderr\": 0.013926915052757336\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1956027293404094,\n \
\ \"acc_stderr\": 0.010926096810556464\n }\n}\n```"
repo_url: https://huggingface.co/unit-mesh/autodev-deepseek-6.7b-finetunes-poc
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|arc:challenge|25_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|gsm8k|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hellaswag|10_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-40-27.582189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T10-40-27.582189.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- '**/details_harness|winogrande|5_2024-03-15T10-40-27.582189.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T10-40-27.582189.parquet'
- config_name: results
data_files:
- split: 2024_03_15T10_40_27.582189
path:
- results_2024-03-15T10-40-27.582189.parquet
- split: latest
path:
- results_2024-03-15T10-40-27.582189.parquet
---
# Dataset Card for Evaluation run of unit-mesh/autodev-deepseek-6.7b-finetunes-poc
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [unit-mesh/autodev-deepseek-6.7b-finetunes-poc](https://huggingface.co/unit-mesh/autodev-deepseek-6.7b-finetunes-poc) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_unit-mesh__autodev-deepseek-6.7b-finetunes-poc",
"harness_winogrande_5",
	split="latest")
```
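Each per-task configuration above follows a fixed naming convention: a `harness_` prefix, the task name with dashes and colons replaced by underscores, and the few-shot count as a suffix. As a minimal sketch of that pattern (the `task_to_config` helper is illustrative, not part of the `datasets` API), the config name to pass to `load_dataset` can be built like this:

```python
def task_to_config(task: str, num_fewshot: int) -> str:
    """Map a harness task name (e.g. 'hendrycksTest-abstract_algebra' or
    'truthfulqa:mc') to the config name used in this dataset card."""
    # Dashes and colons are not valid in config names; they become underscores.
    sanitized = task.replace("-", "_").replace(":", "_")
    return f"harness_{sanitized}_{num_fewshot}"

print(task_to_config("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(task_to_config("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```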
## Latest results
These are the [latest results from run 2024-03-15T10:40:27.582189](https://huggingface.co/datasets/open-llm-leaderboard/details_unit-mesh__autodev-deepseek-6.7b-finetunes-poc/blob/main/results_2024-03-15T10-40-27.582189.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.3755993742028831,
"acc_stderr": 0.03423029641090857,
"acc_norm": 0.3777627808573008,
"acc_norm_stderr": 0.03497427995401455,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4411102243659991,
"mc2_stderr": 0.01482842849226169
},
"harness|arc:challenge|25": {
"acc": 0.33276450511945393,
"acc_stderr": 0.013769863046192307,
"acc_norm": 0.35409556313993173,
"acc_norm_stderr": 0.013975454122756555
},
"harness|hellaswag|10": {
"acc": 0.40669189404501094,
"acc_stderr": 0.004902125388002211,
"acc_norm": 0.5240987851025692,
"acc_norm_stderr": 0.004983982396187372
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3886792452830189,
"acc_stderr": 0.03000048544867599,
"acc_norm": 0.3886792452830189,
"acc_norm_stderr": 0.03000048544867599
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.023330654054535892,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.023330654054535892
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.041049472699033945,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.041049472699033945
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4064516129032258,
"acc_stderr": 0.027941727346256308,
"acc_norm": 0.4064516129032258,
"acc_norm_stderr": 0.027941727346256308
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.032406615658684086,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.032406615658684086
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39378238341968913,
"acc_stderr": 0.03526077095548237,
"acc_norm": 0.39378238341968913,
"acc_norm_stderr": 0.03526077095548237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.024468615241478905,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.024468615241478905
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3865546218487395,
"acc_stderr": 0.03163145807552379,
"acc_norm": 0.3865546218487395,
"acc_norm_stderr": 0.03163145807552379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.363302752293578,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.363302752293578,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.03410785338904719,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.03410785338904719
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.379746835443038,
"acc_stderr": 0.031591887529658504,
"acc_norm": 0.379746835443038,
"acc_norm_stderr": 0.031591887529658504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.03252113489929187,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.03252113489929187
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.47107438016528924,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.47107438016528924,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.4077669902912621,
"acc_stderr": 0.048657775704107696,
"acc_norm": 0.4077669902912621,
"acc_norm_stderr": 0.048657775704107696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.03187195347942466,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.03187195347942466
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.41890166028097064,
"acc_stderr": 0.01764320505237719,
"acc_norm": 0.41890166028097064,
"acc_norm_stderr": 0.01764320505237719
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.407514450867052,
"acc_stderr": 0.0264545781469315,
"acc_norm": 0.407514450867052,
"acc_norm_stderr": 0.0264545781469315
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38562091503267976,
"acc_stderr": 0.02787074527829032,
"acc_norm": 0.38562091503267976,
"acc_norm_stderr": 0.02787074527829032
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3665594855305466,
"acc_stderr": 0.027368078243971625,
"acc_norm": 0.3665594855305466,
"acc_norm_stderr": 0.027368078243971625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.026229649178821167,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.026229649178821167
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169938,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169938
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2920469361147327,
"acc_stderr": 0.011613349136271815,
"acc_norm": 0.2920469361147327,
"acc_norm_stderr": 0.011613349136271815
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.02824568739146291,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.02824568739146291
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3366013071895425,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.3366013071895425,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4489795918367347,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.4489795918367347,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4129353233830846,
"acc_stderr": 0.03481520803367348,
"acc_norm": 0.4129353233830846,
"acc_norm_stderr": 0.03481520803367348
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.0374005938202932,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.0374005938202932
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.39766081871345027,
"acc_stderr": 0.0375363895576169,
"acc_norm": 0.39766081871345027,
"acc_norm_stderr": 0.0375363895576169
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4411102243659991,
"mc2_stderr": 0.01482842849226169
},
"harness|winogrande|5": {
"acc": 0.5666929755327546,
"acc_stderr": 0.013926915052757336
},
"harness|gsm8k|5": {
"acc": 0.1956027293404094,
"acc_stderr": 0.010926096810556464
}
}
```
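The per-task entries in the JSON above all share one shape (a task name mapping to `acc`, `acc_stderr`, and the `_norm` variants), so once loaded as a dict they can be queried directly. A minimal sketch, ranking tasks by accuracy (the two sample entries are copied from the results above):

```python
# Each results entry maps "harness|<task>|<n_shots>" to its metrics;
# ranking tasks by "acc" is a single sorted() call over the keys.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.6153846153846154},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.264804469273743},
}
ranked = sorted(results, key=lambda task: results[task]["acc"], reverse=True)
print(ranked[0])  # harness|hendrycksTest-marketing|5
```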
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
haurajahra/SQUAD_id | ---
license: other
task_categories:
- question-answering
language:
- id
size_categories:
- 100K<n<1M
--- |
lodeawb/wbfns | ---
language:
- en
license: mit
tags:
- natural-language-understanding
size_categories:
- n<1K
task_categories:
- summarization
---
# Dataset Card for wbfns 2018
42 publicly available document texts downloaded from the World Bank Documents and Reports API.
## Dataset Details
### Dataset Description
42 World Bank document texts, related to nutrition and food security, published in 2018. All documents are publicly available from the World Bank Documents and Reports API, here: https://documents.worldbank.org/en/publication/documents-reports/api
- **License:** mit
## Uses
Intended to be used in a very short text-summarisation task.
### Out-of-Scope Use
Not intended to be used for any other purposes.
## Dataset Structure
"id" = World Bank document ID number.
"admreg" = Administrative region.
"count" = The country or countries covered by the document.
"docty" = The type of document, such as 'Project Paper' or 'Working Paper'.
"theme" = Comma-separated list of themes which the document pertains to.
"docdt" = Date on which the document was published.
"majdocty" = Document type according to main usage e.g. 'Project Documents'.
"pdfurl" = Public URL from which the PDF version of the document can be accessed.
"txturl" = Public URL from which the TXT version of the document can be accessed.
"url_friendly_title" = Public parent URL at which the document is hosted.
"projectid" = World Bank Project ID.
"url" = Alternate parent URL at which the document is hosted.
"doc-text" = Contents of the 'txturl', above.
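As a sketch of how these fields fit together (the record below uses illustrative placeholder values, not actual dataset rows), the comma-separated `theme` field can be split into a list and combined with `docty` and `docdt` into a short label:

```python
# Illustrative record following the field list above; all values are
# placeholders, not real data.
record = {
    "id": "29688",
    "admreg": "Africa",
    "count": "Kenya",
    "docty": "Working Paper",
    "theme": "Nutrition and food security,Health Systems and Policies",
    "docdt": "2018-06-01",
    "majdocty": "Publications & Research",
    "txturl": "https://documents.worldbank.org/...",
    "doc-text": "Full extracted document text ...",
}

# "theme" is a comma-separated list; split it and build a short label.
themes = [t.strip() for t in record["theme"].split(",")]
label = f"{record['docty']} ({record['docdt']}): {'; '.join(themes)}"
print(label)  # Working Paper (2018-06-01): Nutrition and food security; Health Systems and Policies
```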
## Dataset Creation
### Curation Rationale
Serves as material for a short sample exercise in text summarisation.
## Dataset Card Contact
lodea@worldbank.org |
Jayeshkumarjangir/memegen_jokes_1217 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 3246484
num_examples: 12317
download_size: 1697021
dataset_size: 3246484
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hyojin99/EBRC | ---
dataset_info:
config_name: hyojin99/EBRC
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: Text
dtype: string
splits:
- name: train
num_bytes: 3208084548.964
num_examples: 39444
- name: test
num_bytes: 397522251.37619185
num_examples: 4931
- name: valid
num_bytes: 400919946.6118081
num_examples: 4930
download_size: 2236327443
dataset_size: 4006526746.952
configs:
- config_name: hyojin99/EBRC
data_files:
- split: train
path: hyojin99/EBRC/train-*
- split: test
path: hyojin99/EBRC/test-*
- split: valid
path: hyojin99/EBRC/valid-*
---
|
CyberHarem/shinki_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shinki/神綺/신키 (Touhou)
This is the dataset of shinki/神綺/신키 (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, one_side_up, hair_ornament, wings, blue_eyes, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 471.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinki_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 327.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinki_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 964 | 589.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinki_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 438.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinki_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 964 | 742.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shinki_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, dress, hair_bobbles, smile, solo, capelet |
| 1 | 10 |  |  |  |  |  | 1girl, dress, hair_bobbles, red_capelet, solo, red_eyes, smile, multiple_wings |
| 2 | 5 |  |  |  |  |  | 1girl, hair_bobbles, looking_at_viewer, red_capelet, simple_background, smile, solo, white_background, long_sleeves, multiple_wings, very_long_hair, red_dress, grey_hair, purple_eyes, wide_sleeves |
| 3 | 13 |  |  |  |  |  | 1girl, bangs, hair_bobbles, looking_at_viewer, red_capelet, red_dress, solo, closed_mouth, smile, long_sleeves, wide_sleeves, grey_eyes, grey_hair, blush, simple_background, white_background, ribbon, multiple_wings, upper_body, very_long_hair |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, collarbone, grey_hair, hair_bobbles, large_breasts, lips, long_sleeves, looking_at_viewer, off_shoulder, red_dress, simple_background, smile, solo, white_background, closed_mouth, grey_eyes, very_long_hair, upper_body, cleavage, cowboy_shot, criss-cross_halter, navel, one-hour_drawing_challenge, parted_bangs, red_capelet, stomach, turtleneck, underboob, white_bikini |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | hair_bobbles | smile | solo | capelet | red_capelet | red_eyes | multiple_wings | looking_at_viewer | simple_background | white_background | long_sleeves | very_long_hair | red_dress | grey_hair | purple_eyes | wide_sleeves | bangs | closed_mouth | grey_eyes | blush | ribbon | upper_body | bare_shoulders | collarbone | large_breasts | lips | off_shoulder | cleavage | cowboy_shot | criss-cross_halter | navel | one-hour_drawing_challenge | parted_bangs | stomach | turtleneck | underboob | white_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------|:--------|:-------|:----------|:--------------|:-----------|:-----------------|:--------------------|:--------------------|:-------------------|:---------------|:-----------------|:------------|:------------|:--------------|:---------------|:--------|:---------------|:------------|:--------|:---------|:-------------|:-----------------|:-------------|:----------------|:-------|:---------------|:-----------|:--------------|:---------------------|:--------|:-----------------------------|:---------------|:----------|:-------------|:------------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | X | | X | | | X | X | X | X | X | X | X | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
fmars/wiki_stem | ---
license: openrail
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 297704620
num_examples: 675700
download_size: 170914035
dataset_size: 297704620
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nitinbhayana/beauty_grocery_sports_multivitamin_title_reverse_ner | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 210581
num_examples: 561
download_size: 97314
dataset_size: 210581
---
# Dataset Card for "beauty_grocery_sports_multivitamin_title_reverse_ner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/df0ba866 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1335
dataset_size: 188
---
# Dataset Card for "df0ba866"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
luckyeven/SROIE2019 | ---
license: unknown
---
|
ccw7463/Ko_QnA_ver0.4 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
- name: ref
dtype: string
- name: context
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 327432073.1375209
num_examples: 245481
download_size: 139034256
dataset_size: 327432073.1375209
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
🚀 Dataset Info
- simple qna : 157469 examples
- context qna : 88012 examples
- Ref (used)
- beomi/KoAlpaca-v1.1a : 21155 examples
- HumanF-MarkrAI/WIKI_QA_Near_dedup : 137505 examples
- squad_kor_v1 : 66181 examples
- https://github.com/KLUE-benchmark/KLUE/tree/main/klue_benchmark/klue-mrc-v1.1 : 21831 examples
- preprocessing
- (1) beomi/KoAlpaca-v1.1a : removed trailing filler phrases such as '추가 정보...' ("additional information..."), '추가 답변...' ("additional answer..."), '덧붙이는 답변...' ("appended answer...")
- (2) all : changed formatting
- (3) all : removed instructions that refer to a subject with no preceding description, e.g. those beginning with '이~는' or '이~은' ("this ...")
- Note (not used because of quality)
- GSON-backup/KoInstruct-QA
- juletxara/tydiqa_xtreme
- Bingsu/ko_alpaca_data
- psymon/namuwiki_alpaca_dataset
- nlpai-lab/openassistant-guanaco-ko (ccw7463/Ko_QnA_ver0.3)
- simple_qna
```python
{'instruction': '양파는 어떤 식물 부위인가요? 그리고 고구마는 뿌리인가요?',
'output': '양파는 잎이 아닌 식물의 줄기 부분입니다. 고구마는 식물의 뿌리 부분입니다. \n\n식물의 부위의 구분에 대해 궁금해하는 분이라면 분명 이 질문에 대한 답을 찾고 있을 것입니다. 양파는 잎이 아닌 줄기 부분입니다. 고구마는 다른 질문과 답변에서 언급된 것과 같이 뿌리 부분입니다. 따라서, 양파는 식물의 줄기 부분이 되고, 고구마는 식물의 뿌리 부분입니다.\n\n ',
'category': 'simpe_qna',
'input': '',
'ref': 'beomi/KoAlpaca-v1.1a',
'context': ''}
```
- context_qna
```python
{'instruction': '주어진 정보를 참고하여 사용자의 질문에 답변해주세요.',
'output': '강철',
'category': 'context_qna',
'input': '마린 라이너용의 전동차 차체의 재질은 무엇인가?',
'ref': 'squad_kor_v1',
'context': "JR 서일본이 1988년 4월 10일의 혼시비산 선 자야마치 역 - 우타즈 역간(혼시비산 선과 우노 선 오카야마 역 - 자야마치 역간과 요산 선 우타즈 역 - 다카마쓰 역간을 총칭하여 세토대교 선의 애칭을 붙일 수 있었다)의 개업시 신조한 단체·이벤트용 차량으로, 쾌속 '마린 라이너'용의 1등차 쿠로 212형과 같은 사양의 3량 편성이다. 세토대교 선 개업시에는 황태자 부부를 태워 주행한 실적이 있다. 쾌속 '마린 라이너'용의 전동차는 1M방식의 213계지만 이 편성은 보통 강철 차체로 게다가 중량 증가 때문에 전동차는 유닛 방식의 211계(쿠모로 211형·모로 210형)가 되었다. 또한 이 편성은 JR 서일본의 직류 전화 구간의 전선 운용을 가능하게 하기 때문에 내한내설구조 및 최고 운행 속도가 120km/h로 되어 있어 편성을 조성하는 쿠로 212형은 1000번대로 구분되고 있다. 덧붙여 1997년 3월 전차량 리뉴얼 공사가 시공되고 있다. 다만 외장은 그대로 유지하고 있다. 1988년도 굿 디자인 상품(현·굿 디자인상)으로 선정되었다."}
``` |
HamdanXI/gloss_merged_dataset_with_adj_adv | ---
dataset_info:
features:
- name: gloss
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20161967
num_examples: 144285
download_size: 12124595
dataset_size: 20161967
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gloss_merged_dataset_with_adj_adv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manu/dila_legifrance | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4399589467
num_examples: 2349748
download_size: 1326748165
dataset_size: 4399589467
---
# Dataset Card for "dila_legifrance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-non-qLoRa | ---
pretty_name: Evaluation run of NLUHOPOE/experiment2-cause-non-qLoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/experiment2-cause-non-qLoRa](https://huggingface.co/NLUHOPOE/experiment2-cause-non-qLoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-non-qLoRa\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T01:32:53.076387](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-non-qLoRa/blob/main/results_2024-03-02T01-32-53.076387.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6197987667684087,\n\
\ \"acc_stderr\": 0.03270734461121569,\n \"acc_norm\": 0.6261639848463227,\n\
\ \"acc_norm_stderr\": 0.03337642036997771,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.45469695402927457,\n\
\ \"mc2_stderr\": 0.01450788864306172\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180639\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6211909978092014,\n\
\ \"acc_stderr\": 0.00484099059349469,\n \"acc_norm\": 0.8292172873929496,\n\
\ \"acc_norm_stderr\": 0.0037554989417818516\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n\
\ \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n\
\ \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n\
\ \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n\
\ \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"\
acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"\
acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035307,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.02308663508684141,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.02308663508684141\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n\
\ \"acc_stderr\": 0.014385525076611573,\n \"acc_norm\": 0.7969348659003831,\n\
\ \"acc_norm_stderr\": 0.014385525076611573\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n\
\ \"acc_stderr\": 0.016303899530796136,\n \"acc_norm\": 0.3888268156424581,\n\
\ \"acc_norm_stderr\": 0.016303899530796136\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.01267190278256765,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.01267190278256765\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159703,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505514,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505514\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.45469695402927457,\n\
\ \"mc2_stderr\": 0.01450788864306172\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33586050037907506,\n \
\ \"acc_stderr\": 0.013009224714267353\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/experiment2-cause-non-qLoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|arc:challenge|25_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|gsm8k|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hellaswag|10_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-32-53.076387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T01-32-53.076387.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- '**/details_harness|winogrande|5_2024-03-02T01-32-53.076387.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T01-32-53.076387.parquet'
- config_name: results
data_files:
- split: 2024_03_02T01_32_53.076387
path:
- results_2024-03-02T01-32-53.076387.parquet
- split: latest
path:
- results_2024-03-02T01-32-53.076387.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/experiment2-cause-non-qLoRa
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/experiment2-cause-non-qLoRa](https://huggingface.co/NLUHOPOE/experiment2-cause-non-qLoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-non-qLoRa",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-02T01:32:53.076387](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-non-qLoRa/blob/main/results_2024-03-02T01-32-53.076387.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6197987667684087,
"acc_stderr": 0.03270734461121569,
"acc_norm": 0.6261639848463227,
"acc_norm_stderr": 0.03337642036997771,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.45469695402927457,
"mc2_stderr": 0.01450788864306172
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064663,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180639
},
"harness|hellaswag|10": {
"acc": 0.6211909978092014,
"acc_stderr": 0.00484099059349469,
"acc_norm": 0.8292172873929496,
"acc_norm_stderr": 0.0037554989417818516
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.02308663508684141,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.02308663508684141
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611573,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.016303899530796136,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.016303899530796136
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.01267190278256765,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.01267190278256765
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.019450768432505514,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.019450768432505514
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.45469695402927457,
"mc2_stderr": 0.01450788864306172
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.33586050037907506,
"acc_stderr": 0.013009224714267353
}
}
```
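As a quick sanity check, the per-task metrics in a results payload like the one above can be aggregated in a few lines. This is a sketch over a trimmed excerpt (the field layout follows the JSON shown; the subset of tasks here is illustrative):

```python
import json

# Trimmed excerpt of a results payload, following the structure shown above.
results_json = """
{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
  "harness|winogrande|5": {"acc": 0.7805840568271507}
}
"""
results = json.loads(results_json)

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mmlu_avg:.4f}")
```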
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Nexdata/Pushtu_Conversational_Speech_Data_by_Telephone | ---
language:
- ps
task_categories:
- conversational
- automatic-speech-recognition
---
# Dataset Card for Nexdata/Pushtu_Conversational_Speech_Data_by_Telephone
## Description
The 200 Hours - Pushtu Conversational Speech Data collected by telephone involved more than 230 native speakers, with a proper gender-ratio balance. Speakers chose a few familiar topics from a given list and held conversations to ensure the dialogues' fluency and naturalness. The recording devices were various mobile phones. The audio format is 8 kHz, 8-bit WAV, and all the speech data was recorded in quiet indoor environments. All the speech audio was manually transcribed with the text content, the start and end time of each effective sentence, and speaker identification.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1248?source=Huggingface
# Specifications
## Format
8kHz, 8bit, wav, mono channel;
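As an illustration, the stated format can be verified with Python's built-in `wave` module. This sketch writes a dummy one-second file only to stay self-contained; in practice you would open one of the dataset's recordings instead (the filename here is hypothetical):

```python
import wave

# Write a tiny dummy file with the documented spec: 8 kHz, 8-bit, mono WAV.
path = "sample.wav"
with wave.open(path, "wb") as w:
    w.setnchannels(1)      # mono
    w.setsampwidth(1)      # 8-bit -> 1 byte per sample
    w.setframerate(8000)   # 8 kHz
    w.writeframes(bytes([128] * 8000))  # one second of silence (unsigned midpoint)

# Verify the parameters of a recording.
with wave.open(path, "rb") as r:
    assert r.getnchannels() == 1
    assert r.getsampwidth() == 1
    assert r.getframerate() == 8000
    duration_s = r.getnframes() / r.getframerate()
    print(f"duration: {duration_s:.2f} s")
```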
## Recording Environment
quiet indoor environment, without echo;
## Recording content
dozens of topics are specified, and the speakers make dialogue under those topics while the recording is performed;
## Demographics
About 230 people.
## Annotation
annotating for the transcription text, speaker identification and gender
## Device
Telephony recording system;
## Language
Pushtu
## Application scenarios
speech recognition; voiceprint recognition;
## Accuracy rate
the word accuracy rate is not less than 95%
# Licensing Information
Commercial License |
milktruck/OABTcleaned | ---
license: apache-2.0
---
|
Ar4ikov/civitai-sd-337k | ---
annotations_creators:
- no-annotation
language_creators:
- thefcraft
language:
- en
pretty_name: civitai-stable-diffusion-337k
size_categories:
- 1M<n<10M
source_datasets:
- civitai
duplicated_from: thefcraft/civitai-stable-diffusion-337k
---
### Dataset Summary
Dataset: civitai-stable-diffusion-337k. This dataset contains 337k Civitai image URLs with prompts and related metadata; I used the Civitai API to get all prompts.
Project: https://github.com/thefcraft/nsfw-prompt-detection-sd (I trained a model on this dataset).
Data structure for `.civitai.json`:
```python
{
'items':[
{'id': 100657,
'url': 'https://imagecache.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/2338276a-87f7-4a1e-f92a-776a18ee4200/width=768/2338276a-87f7-4a1e-f92a-776a18ee4200.jpeg',
'hash': 'U5Exz_00.8D$t89Z%M0100~VD*RktQxaIU~p',
'width': 768,
'height': 1368,
'nsfw': True,
'createdAt': '2023-02-14T10:05:11.498Z',
'postId': 60841,
'stats': {'cryCount': 0,
'laughCount': 0,
'likeCount': 26,
'dislikeCount': 0,
'heartCount': 50,
'commentCount': 4},
'meta': {'ENSD': '31337',
'Size': '512x912',
'seed': 3994946333,
'Model': 'AbyssOrangeMix2_sfw',
'steps': 20,
'prompt': '<lora:hiqcg_body-epoch-000004:0.5>, <lora:hiqcg_face-epoch-000004:0.4>, hiqcgbody, hiqcgface, 1girl, full body, standing, \ndetailed skin texture, detailed cloth texture, beautiful detailed face,\nmasterpiece, best quality, ultra detailed, 8k, intricate details,',
'sampler': 'DPM++ 2M Karras',
'cfgScale': 7,
'Clip skip': '2',
'resources': [{'hash': '038ba203d8',
'name': 'AbyssOrangeMix2_sfw',
'type': 'model'}],
'Model hash': '038ba203d8',
'Hires upscale': '1.5',
'Hires upscaler': 'Latent',
'negativePrompt': 'EasyNegative, extra fingers,fewer fingers, multiple girls, multiple views,',
'Denoising strength': '0.6'},
'username': 'NeoClassicalRibbon'},
{..},
..],
'metadata':{'totalItems': 327145}
}
```
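As an example of working with this structure, the snippet below pulls the prompts of SFW items out of a trimmed sample (the sample items here are illustrative; key names follow the layout above):

```python
# Trimmed sample following the structure of .civitai.json shown above.
data = {
    "items": [
        {"id": 100657, "nsfw": True,
         "meta": {"prompt": "1girl, full body, standing"}},
        {"id": 100658, "nsfw": False,
         "meta": {"prompt": "a watercolor landscape, mountains"}},
        {"id": 100659, "nsfw": False, "meta": None},  # some items lack metadata
    ],
    "metadata": {"totalItems": 3},
}

# Keep prompts of SFW items that actually carry generation metadata.
sfw_prompts = [
    item["meta"]["prompt"]
    for item in data["items"]
    if not item["nsfw"] and item.get("meta")
]
print(sfw_prompts)
```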
|
bneel-work/Ubuntu-Kpis-Prompt-With-Time | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1464908
num_examples: 5063
download_size: 185966
dataset_size: 1464908
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zicsx/IndicTrans2-Hindi | ---
dataset_info:
features:
- name: english
dtype: string
- name: hindi
dtype: string
splits:
- name: train
num_bytes: 10285172169
num_examples: 39333242
download_size: 5052156809
dataset_size: 10285172169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/mayer_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mayer/メイヤー/梅尔 (Arknights)
This is the dataset of mayer/メイヤー/梅尔 (Arknights), containing 123 images and their tags.
The core tags of this character are `ahoge, brown_hair, breasts, hair_between_eyes, brown_eyes, grey_hair, grey_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 123 | 164.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mayer_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 123 | 140.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mayer_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 275 | 265.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mayer_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mayer_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 35 |  |  |  |  |  | 1girl, black_sweater, ribbed_sweater, solo, orange_jacket, turtleneck_sweater, short_hair_with_long_locks, looking_at_viewer, long_sleeves, black_gloves, smile, simple_background, open_jacket, armband, white_background, black_pantyhose, closed_mouth, blush, medium_breasts, sidelocks |
| 1 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, open_jacket, solo, blue_gloves, grey_jacket, short_hair_with_long_locks, black_footwear, boots, holding, long_sleeves, smile, blue_dress, blue_shirt, full_body, hair_ornament, open_mouth, thigh_strap, white_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_sweater | ribbed_sweater | solo | orange_jacket | turtleneck_sweater | short_hair_with_long_locks | looking_at_viewer | long_sleeves | black_gloves | smile | simple_background | open_jacket | armband | white_background | black_pantyhose | closed_mouth | blush | medium_breasts | sidelocks | blue_gloves | grey_jacket | black_footwear | boots | holding | blue_dress | blue_shirt | full_body | hair_ornament | open_mouth | thigh_strap | white_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-----------------|:-------|:----------------|:---------------------|:-----------------------------|:--------------------|:---------------|:---------------|:--------|:--------------------|:--------------|:----------|:-------------------|:------------------|:---------------|:--------|:-----------------|:------------|:--------------|:--------------|:-----------------|:--------|:----------|:-------------|:-------------|:------------|:----------------|:-------------|:--------------|:---------------|
| 0 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | X | | | X | X | X | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
manu/wikisource_fr | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 11647349958
num_examples: 2567238
download_size: 7238737612
dataset_size: 11647349958
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wikisource_fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rangeli4/test_ds | ---
license: cc
---
|
saibo/bookcorpus_compact_1024 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2753205189
num_examples: 616051
download_size: 1603181006
dataset_size: 2753205189
size_categories:
- 100K<n<1M
---
# Dataset Card for "bookcorpus_compact_1024"
Num samples: 616,051
The number of tokens in each sequence is not exactly 1024; all sequences are slightly shorter than 1024 tokens.
The sequences were built by merging consecutive sentences up to the maximal length below 1024 tokens.
Therefore, padding is necessary for batch processing.
```python
import time
from typing import List
from datasets import load_dataset, Dataset
from tqdm import tqdm
from transformers import AutoTokenizer
def batch_tokenize(texts: List[str], tokenizer, batch_size=1000):
    """Tokenize the texts in batch"""
    start = time.time()
assert tokenizer.is_fast, "tokenizer must be fast tokenizer"
tokenized_texts = []
for i in tqdm(range(0, len(texts), batch_size)):
batch = texts[i:i + batch_size]
batch_encoding = tokenizer(batch)
tokenized_texts.extend(batch_encoding["input_ids"])
print(f"batch_tokenize time with bs={batch_size}: {time.time() - start}")
return tokenized_texts
class CompactText:
def __init__(self, tokenizer="gpt2", split="test", block_size=512):
self.block_size = block_size
self.tokenizer = AutoTokenizer.from_pretrained(tokenizer)
def compact_load(self, dataset_name: str, split: str):
dataset = load_dataset(dataset_name)[split]
batch_encoding = batch_tokenize(dataset["text"], self.tokenizer, batch_size=10000)
compact_texts = []
texts = dataset["text"]
total_num_tok = 0
tracker = []
i = 0
for j in tqdm(range(len(batch_encoding))):
total_num_tok += len(batch_encoding[j])
if total_num_tok >= self.block_size:
batch_sents = texts[i:j]
big_sent = " ".join(batch_sents)
compact_texts.append(big_sent)
tracker.append((i, j))
i = j
total_num_tok = 0
print(tracker)
# self.examples = compact_texts
compact_ds = Dataset.from_dict({"text": compact_texts})
return compact_ds
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser()
parser.add_argument("-b", "--block-size", type=int, default=512)
args = parser.parse_args()
compactifier = CompactText(block_size=args.block_size)
dataset = compactifier.compact_load(dataset_name="saibo/bookcorpus_deduplicated", split="train")
dataset.push_to_hub(f"saibo/bookcorpus_compact_{args.block_size}")
```
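The core merging loop can also be sketched without any tokenizer, using whitespace tokens in place of GPT-2 tokens. Note that this simplified variant differs from the script above in two ways: it carries the boundary sentence into the next block instead of discarding its token count, and it keeps the final partial block:

```python
from typing import Callable, List

def merge_to_blocks(sentences: List[str],
                    count_tokens: Callable[[str], int],
                    block_size: int) -> List[str]:
    """Greedily join consecutive sentences, keeping each block under block_size tokens."""
    blocks: List[str] = []
    current: List[str] = []
    total = 0
    for sent in sentences:
        n = count_tokens(sent)
        if current and total + n >= block_size:
            blocks.append(" ".join(current))
            current, total = [], 0
        current.append(sent)
        total += n
    if current:  # keep the trailing partial block
        blocks.append(" ".join(current))
    return blocks

sents = ["a b c", "d e", "f g h i", "j", "k l m"]
blocks = merge_to_blocks(sents, count_tokens=lambda s: len(s.split()), block_size=5)
print(blocks)  # here every block stays strictly under 5 whitespace tokens
```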
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-d60b4e7e-7574886 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: Ning-fish/xlm-roberta-base-finetuned-panx-de
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.de
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Ning-fish/xlm-roberta-base-finetuned-panx-de
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
liaad/Harem | ---
license: mit
---
|
orgcatorg/hudson | ---
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: category
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 1873296
num_examples: 303
download_size: 1085678
dataset_size: 1873296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Limour/G2Retrieval | ---
license: cc-by-nc-sa-4.0
language:
- zh
---
A Retrieval evaluation dataset for the [visual novel](https://huggingface.co/datasets/Limour/b-corpus) domain.
# Leaderboard
## data_sample2k
+ https://www.kaggle.com/code/reginliu/g2retrieval
| Model | NDCG@3 | NDCG@10 | NDCG@50 | NDCG@100 | NDCG@200 |
|-------|---------|---------|---------|---------|---------|
| [acge_text_embedding](https://huggingface.co/aspire/acge_text_embedding) | 83.53±17.86 | 76.97±17.79 | 61.52±20.61 | 52.07±20.87 | 42.49±19.83 |
| [IYun-large-zh](https://huggingface.co/Erin/IYun-large-zh) | 80.53±20.53 | 71.40±20.87 | 52.93±21.96 | 43.40±20.72 | 34.88±18.50 |
| [bce-embedding-base_v1](https://huggingface.co/maidalun1020/bce-embedding-base_v1) | 77.08±23.44 | 68.39±22.61 | 51.95±22.85 | 43.36±21.51 | 35.31±19.09 |
| [Dmeta-embedding](https://huggingface.co/DMetaSoul/Dmeta-embedding) | 77.56±22.12 | 68.62±21.96 | 51.58±22.29 | 42.71±21.04 | 34.33±18.61 |
| random | 01.66±05.78 | 02.06±03.66 | 02.23±02.52 | 02.13±02.03 | 02.40±01.91 | |
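The NDCG@k figures in the table follow the standard definition; the sketch below is our own minimal illustration of the metric, not the exact evaluation code used in the Kaggle notebook:

```python
import math

def ndcg_at_k(relevances, k):
    """NDCG@k for one query: relevances is the graded relevance of each
    retrieved document, in ranked order."""
    def dcg(rels):
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# Toy ranking: the single relevant passage was retrieved at rank 2.
print(round(ndcg_at_k([0, 1, 0, 0], k=3), 4))  # 0.6309
```

Scores like those in the table are then averaged (with standard deviation) over all queries.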
toilaluan/nike_laion | ---
dataset_info:
features:
- name: image
dtype: image
- name: url
dtype: string
- name: caption
dtype: string
- name: id
dtype: int64
- name: similarity
dtype: float64
splits:
- name: train
num_bytes: 343403671.807
num_examples: 5117
download_size: 282913216
dataset_size: 343403671.807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "nike_laion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jaygala223/38-cloud-train-only-v3-with-NIR | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1133909329.0
num_examples: 8400
download_size: 1130978486
dataset_size: 1133909329.0
---
# Dataset Card for "38-cloud-train-only-v3-with-nir"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/distilabel-capybara-kto-15k-binarized | ---
language:
- en
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- conversational
- question-answering
- text-generation
pretty_name: CapybaraDPO-7k
tags:
- Physics
- Biology
- Math
- Chemistry
- Culture
- Logic
- Roleplay
- rlaif
- rlhf
- kto
- distilabel
- synthetic
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
list:
- name: content
dtype: string
- name: role
dtype: string
- name: label
dtype: bool
- name: rating
dtype: int64
- name: model
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 129692808
num_examples: 15126
download_size: 42545061
dataset_size: 129692808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Capybara-KTO 15K binarized
> A KTO-signal-transformed version of the highly loved [Capybara-DPO 7K binarized](https://huggingface.co/datasets/argilla/distilabel-capybara-dpo-7k-binarized), a DPO dataset built with [distilabel](https://github.com/argilla-io/distilabel) atop the awesome [LDJnr/Capybara](https://huggingface.co/datasets/LDJnr/Capybara).
> This is a preview version to collect feedback from the community. v2 will include the full base dataset and responses from more powerful models.
<div>
<img src="https://cdn-uploads.huggingface.co/production/uploads/60420dccc15e823a685f2b03/Vmr0FtTvnny6Snm-UDM_n.png">
</div>
<p align="center">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
## Why KTO?
The [KTO paper](https://arxiv.org/abs/2402.01306) states:
- KTO matches or exceeds DPO performance at scales from 1B to 30B parameters. That is, taking a preference dataset of n DPO pairs and breaking it up into 2n examples for KTO can yield better generations, despite the model ostensibly learning from a weaker signal.
- KTO can handle extreme data imbalances, matching DPO performance while using up to 90% fewer desirable examples (i.e., examples of good generations). Its success thus cannot be ascribed to the alignment data being sourced from a preference dataset.
- When the pretrained model is sufficiently good, one can skip supervised finetuning and go straight to KTO without a loss in generation quality. In contrast, we find that without doing SFT first, DPO-aligned models are significantly worse at all scales.
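The "n DPO pairs into 2n examples" transformation described above can be sketched as follows. This is a minimal illustration of the idea, not Argilla's actual pipeline; the field names mirror this dataset's `prompt`/`completion`/`label` columns:

```python
# Each DPO preference pair becomes two KTO examples: the chosen completion
# is a desirable example (label=True), the rejected one undesirable (label=False).
def dpo_pair_to_kto(prompt, chosen, rejected):
    return [
        {"prompt": prompt, "completion": chosen, "label": True},
        {"prompt": prompt, "completion": rejected, "label": False},
    ]

dpo_pairs = [("What is 2+2?", "4", "5")]
kto_examples = [ex for p in dpo_pairs for ex in dpo_pair_to_kto(*p)]
print(len(kto_examples))  # 2n examples from n pairs
```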
## Reproduce KTO Transformation
Original [distilabel Capybara-DPO 7K binarized](https://huggingface.co/datasets/argilla/distilabel-capybara-dpo-7k-binarized)
<a target="_blank" href="https://colab.research.google.com/drive/1xmc2q966UrLoHwZ4g-2Wd9qKzQLF-IJm?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a> |
sam-mosaic/dolly_hhrlhf_yashgoenka-gorilla-16k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 51256773.87977925
num_examples: 60310
- name: test
num_bytes: 16933404.269971937
num_examples: 15129
download_size: 29936297
dataset_size: 68190178.14975119
---
# Dataset Card for "dolly_hhrlhf_yashgoenka-gorilla-16k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heroza/isic2017_task3 | ---
dataset_info:
- config_name: sub1
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': combined
'1': seb
splits:
- name: train
num_bytes: 1354643486.0
num_examples: 2000
- name: validation
num_bytes: 869316023.0
num_examples: 150
- name: test
num_bytes: 5548533480.0
num_examples: 600
download_size: 12198304944
dataset_size: 7772492989.0
- config_name: sub2
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': combined
'1': mel
splits:
- name: train
num_bytes: 2290898224.0
num_examples: 2000
- name: validation
num_bytes: 869316023.0
num_examples: 150
- name: test
num_bytes: 5548533480.0
num_examples: 600
download_size: 12198307368
dataset_size: 8708747727.0
configs:
- config_name: sub1
data_files:
- split: train
path: sub1/train-*
- split: validation
path: sub1/validation-*
- split: test
path: sub1/test-*
- config_name: sub2
data_files:
- split: train
path: sub2/train-*
- split: validation
path: sub2/validation-*
- split: test
path: sub2/test-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-15000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1026188
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DynamicSuperb/IntentClassification_FluentSpeechCommands | ---
dataset_info:
features:
- name: file
dtype: string
- name: speakerId
dtype: string
- name: transcription
dtype: string
- name: audio
dtype: audio
- name: action
dtype: string
- name: object
dtype: string
- name: location
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 2508839169.15
num_examples: 30043
download_size: 1918599335
dataset_size: 2508839169.15
---
# Dataset Card for "intent_classification_fluent_speech_commands"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Charlonbh/vozlecos | ---
license: openrail
---
|
salma-remyx/ffmperative_sample_5k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1948807
num_examples: 5000
download_size: 599304
dataset_size: 1948807
---
# Dataset Card for "ffmperative_sample_5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spacelephant/simpleMix_v3 | ---
license: unknown
---
|
strickvl/isafpressreleasescomplete | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
task_ids:
- document-retrieval
pretty_name: ISAFpressreleasesComplete
language:
- en
size_categories:
- 10K<n<100K
annotations_creators:
- no-annotation
multilinguality:
- monolingual
source_datasets:
- extended|isafpressreleases
---
## ISAF Press Releases Complete Dataset Description
- **Homepage:** [N/A]
- **Repository:** [N/A]
- **Paper:** [A Knock on the Door: 22 Months of ISAF Press Releases](https://www.afghanistan-analysts.org/en/special-reports/a-knock-on-the-door-22-months-of-isaf-press-releases/)
- **Original Dataset:** [ISAF Press Releases Dataset](https://huggingface.co/datasets/strickvl/isafpressreleases)
- **Point of Contact:** Alex Strick van Linschoten ([@strickvl](https://huggingface.co/strickvl))
### Dataset Summary
The ISAF Press Releases Complete dataset is an extension of the original [ISAF Press Releases Dataset](https://huggingface.co/datasets/strickvl/isafpressreleases). It contains the raw HTML files of press releases issued by the International Security Assistance Force (ISAF) in Afghanistan, covering a broader period than the original dataset, extending from 2009 to 2016. In addition to the HTML files, the dataset provides a Parquet file (`data/isafpressreleases-complete2024.parquet`) that contains all the data parsed from the HTML files and API requests. This Parquet file serves as the primary resource for researchers and organizations interested in using the dataset.
The dataset offers a comprehensive collection of press releases, enabling researchers and organizations to analyze and process the data according to their specific requirements. The HTML files are organized by year and month for archival purposes, while the Parquet file provides a structured and easily accessible format for data analysis.
### Supported Tasks and Leaderboards
- `document-retrieval`: The dataset can be used for document retrieval tasks, where the goal is to find relevant press releases based on specific queries or criteria. Researchers can utilize the Parquet file to develop and evaluate retrieval algorithms.
- `text-generation`: The press releases in the dataset can serve as a resource for text generation tasks, such as language modeling or summarization. The Parquet file provides a diverse collection of military-related text that can be used to train and test generative models.
### Languages
The press releases in the dataset are entirely in English. They contain military jargon and Afghanistan-specific places and context, which are explained in the associated research paper.
## Dataset Structure
### Data Instances
The primary data resource in the dataset is the Parquet file (`data/isafpressreleases-complete2024.parquet`), which contains all the data parsed from the HTML files and API requests. Each row in the Parquet file represents a single press release and includes relevant information extracted from the HTML files.
The dataset also includes the raw HTML files of ISAF press releases, organized by year and month in the `data` directory, for archival purposes.
### Data Fields
The Parquet file (`data/isafpressreleases-complete2024.parquet`) contains structured data with predefined fields extracted from the HTML files and API requests. Researchers and organizations can refer to the Parquet file schema to understand the available data fields and their types.
### Data Splits
The dataset is not split into predefined subsets. The entire dataset is provided as a single Parquet file (`data/isafpressreleases-complete2024.parquet`) for ease of use and analysis.
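As a minimal sketch, the Parquet file can be loaded and filtered with pandas. The tiny stand-in frame below only illustrates the pattern; the column names shown are hypothetical and should be checked against the actual schema of `data/isafpressreleases-complete2024.parquet`:

```python
import pandas as pd

# In practice: df = pd.read_parquet("data/isafpressreleases-complete2024.parquet")
# Stand-in frame with hypothetical columns, for illustration only.
df = pd.DataFrame({
    "date": pd.to_datetime(["2011-01-05", "2012-06-17"]),
    "title": ["Press release A", "Press release B"],
    "body": ["...", "..."],
})

# Filter press releases to a date range of interest.
recent = df[df["date"] >= "2012-01-01"]
print(len(recent))  # 1
```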
## Dataset Creation
### Curation Rationale
The ISAF Press Releases Complete dataset was created to provide researchers and organizations with access to a comprehensive collection of ISAF press releases in a structured and easily accessible format. The Parquet file (`data/isafpressreleases-complete2024.parquet`) serves as the primary resource, containing all the data parsed from the HTML files and API requests. This format enables efficient querying, analysis, and processing of the press release data.
The raw HTML files are also included in the dataset for archival purposes and to allow researchers to refer to the original source material if needed.
### Source Data
#### Initial Data Collection and Normalization
The HTML files were collected using the script provided in the `scripts/` folder of the dataset repository. The script requires a DVIDS API key from https://api.dvidshub.net/ to download the press releases. The downloaded HTML files were then organized by year and month in the `data` directory.
The data from the HTML files, along with additional information retrieved from API requests, was parsed and stored in the Parquet file (`data/isafpressreleases-complete2024.parquet`). This process normalized the data and provided a structured format for analysis and processing.
#### Who are the source language producers?
The press releases were written by the press office and media relations team of ISAF (International Security Assistance Force) and later NATO (North Atlantic Treaty Organization) during the specified period. They were created by human writers as official communications from ISAF/NATO.
### Annotations
The dataset does not include any annotations or labeling. It consists of the raw HTML files and the parsed data in the Parquet file.
### Personal and Sensitive Information
The dataset contains information mentioned in the original press releases, which were publicly issued by ISAF. However, as the dataset provides both the raw HTML files and the parsed data in the Parquet file, it is the responsibility of the researchers and organizations using the dataset to handle any personal or sensitive information appropriately and in compliance with relevant regulations and ethical guidelines.
## Considerations for Using the Data
### Social Impact of Dataset
The ISAF Press Releases Complete dataset provides an important historical record of ISAF's activities in Afghanistan during the specified period. It enables researchers, legal teams, and organizations to access and analyze the press release data, contributing to a better understanding of the events and their impact. The availability of this data is crucial for Afghan history and ensures that the information remains accessible for further research and analysis.
However, it is important to consider the potential biases and limitations of the dataset, as discussed in the following sections.
### Discussion of Biases
The dataset reflects the inherent biases and limitations of ISAF's presence and understanding of Afghanistan. The press releases were created by ISAF/NATO and may not provide a complete or unbiased account of the events. It is important to recognize that the dataset represents the perspective and narrative of ISAF/NATO and may not necessarily reflect the experiences or viewpoints of other stakeholders, particularly the Afghan population.
### Other Known Limitations
The dataset has several known limitations:
- Incomplete coverage: While the dataset aims to provide a comprehensive collection of ISAF press releases, there may be gaps or missing information due to the availability and accessibility of the source data.
- Potential inconsistencies: As the press releases were created by different individuals over an extended period, there may be inconsistencies in the style, format, or content of the data.
Users of the dataset should be aware of these limitations and consider them when working with the data.
## Additional Information
### Dataset Curators
The dataset was created by Alex Strick van Linschoten and Felix Kuehn as an extension to their original [ISAF Press Releases Dataset](https://huggingface.co/datasets/strickvl/isafpressreleases).
### Licensing Information
This dataset is licensed under the Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license. For more information, see [https://creativecommons.org/licenses/by-sa/4.0/](https://creativecommons.org/licenses/by-sa/4.0/). Access to the dataset is restricted to legitimate researchers and organizations. For more information about accessing and using the dataset, please contact the dataset curators.
### Citation Information
When using the ISAF Press Releases Complete dataset, please cite both the
original dataset and the associated research paper:
```
@dataset{strick_van_linschoten_kuehn_2023_isafpressreleasescomplete,
author = {Alex Strick van Linschoten and Felix Kuehn},
title = {ISAF Press Releases Complete},
year = {2023},
url = {https://huggingface.co/datasets/strickvl/isafpressreleasescomplete}
}
@article{strick_van_linschoten_kuehn_2011,
author = {Alex Strick van Linschoten and Felix Kuehn},
title = {A Knock on the Door: 22 Months of ISAF Press Releases},
journal = {Afghanistan Analysts Network},
year = {2011},
month = {October},
day = {12},
url = {https://www.afghanistan-analysts.org/en/special-reports/a-knock-on-the-door-22-months-of-isaf-press-releases/}
}
```
### Contributions
Many thanks to the [Afghanistan Analysts
Network](https://www.afghanistan-analysts.org/en) for funding the research and
supporting the creation of this dataset.
|
main-horse/ffv4_dataset_test | ---
license: openrail
dataset_info:
features:
- name: id
dtype: int32
- name: header
dtype: string
- name: story
dtype: string
splits:
- name: everything
num_bytes: 4112502210
num_examples: 52357
download_size: 2446111268
dataset_size: 4112502210
---
This is a testing dataset for future model testing. You should not use it (yet).

There are multiple dataset configurations:
* `notebook_defaults`
* `notebook_defaults_ratio0.8_likes10`

You can load each one like this:
```python
import datasets
# see FFV4.BUILDER_CONFIGS for all possible names
ds = datasets.load_dataset('./dataset_code.py', name='notebook_defaults_ratio0.8_likes10')
```
Then use them like this:
```python
ds_real = ds['everything']  # there is no such thing as a train/test split here
one_item = ds_real[0]  # grab the first story
one_item_truncated = one_item | {'story': one_item['story'][:1000]}  # truncate its text to the first 1000 characters
print(ds)
print(one_item_truncated)
```
This will show something vaguely useful:
```python
DatasetDict({
everything: Dataset({
features: ['id', 'header', 'story'],
num_rows: 52357
})
})
{'id': 394130, 'header': '<|info|>\ntitle: Broken, But Getting Better\nauthor: Rose Quill\ntags: character:Tempest Shadow, character:Twilight Sparkle, genre:Slice of Life, series:My Little Pony: Friendship is Magic', 'story': "=== Broken ===\nI stared at the paper, a pencil in my mouth as I considered the next words. I was not the most well read of ponies, having always taken the stance that actions speak louder, but I felt that this time needed some words to explain. I scanned what I had already written to try and jog my vocabulary.\nPrincess Twilight,\nBy the time you read this, I'll have left. I know you offered me your friendship, but I think it would do me well to be apart from other ponies for a few days…give or take a week.\nThis is not running away, no. Far from it. I have been away from my kind for so long I fear I have forgotten what it means to even be Equestrian. I need time to observe with no metric standing against me.\nI sighed and glanced out the window at the town of Ponyville, the town square filled with banners and other evidence of an upcoming party. In the glass of the portal, I saw the snapped stub of my horn, and I felt the dull pain that I had lived with for most of my life.\nI reached up a"}
```
|
OpenShape/openshape-training-data | ---
license: openrail
---
|
mteb-pt/askubuntudupquestions | ---
configs:
- config_name: pt-br
data_files:
- split: test
path: test*
--- |
Niwat/test1 | ---
license: wtfpl
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 23665
num_examples: 10
download_size: 27131
dataset_size: 23665
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hippocrates/CitationGPTv12345_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 365813520
num_examples: 99360
- name: valid
num_bytes: 47375754
num_examples: 12760
- name: test
num_bytes: 42198711
num_examples: 11615
download_size: 175738218
dataset_size: 455387985
---
# Dataset Card for "CitationGPTv12345_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_migtissera__Synthia-70B | ---
pretty_name: Evaluation run of migtissera/Synthia-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B](https://huggingface.co/migtissera/Synthia-70B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T22:51:19.251335](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B/blob/main/results_2023-10-15T22-51-19.251335.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15100671140939598,\n\
\ \"em_stderr\": 0.0036668226447704277,\n \"f1\": 0.21747168624161078,\n\
\ \"f1_stderr\": 0.0037439821226941702,\n \"acc\": 0.5752480443377197,\n\
\ \"acc_stderr\": 0.011586688610663485\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.15100671140939598,\n \"em_stderr\": 0.0036668226447704277,\n\
\ \"f1\": 0.21747168624161078,\n \"f1_stderr\": 0.0037439821226941702\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.31387414708112205,\n \
\ \"acc_stderr\": 0.012782681251053207\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273763\n\
\ }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|arc:challenge|25_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T22_51_19.251335
path:
- '**/details_harness|drop|3_2023-10-15T22-51-19.251335.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T22-51-19.251335.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T22_51_19.251335
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-51-19.251335.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-51-19.251335.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hellaswag|10_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T05_19_54.133935
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T05:19:54.133935.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T05:19:54.133935.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T22_51_19.251335
path:
- '**/details_harness|winogrande|5_2023-10-15T22-51-19.251335.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T22-51-19.251335.parquet'
- config_name: results
data_files:
- split: 2023_10_15T22_51_19.251335
path:
- results_2023-10-15T22-51-19.251335.parquet
- split: latest
path:
- results_2023-10-15T22-51-19.251335.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B](https://huggingface.co/migtissera/Synthia-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T22:51:19.251335](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B/blob/main/results_2023-10-15T22-51-19.251335.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.15100671140939598,
"em_stderr": 0.0036668226447704277,
"f1": 0.21747168624161078,
"f1_stderr": 0.0037439821226941702,
"acc": 0.5752480443377197,
"acc_stderr": 0.011586688610663485
},
"harness|drop|3": {
"em": 0.15100671140939598,
"em_stderr": 0.0036668226447704277,
"f1": 0.21747168624161078,
"f1_stderr": 0.0037439821226941702
},
"harness|gsm8k|5": {
"acc": 0.31387414708112205,
"acc_stderr": 0.012782681251053207
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273763
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Belalallalavagna/Emilianoscemoxsempre | ---
license: unknown
---
|
Francesco/construction-safety-gsnvb | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': construction-safety
'1': helmet
'2': no-helmet
'3': no-vest
'4': person
'5': vest
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: construction-safety-gsnvb
tags:
- rf100
---
# Dataset Card for construction-safety-gsnvb
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/construction-safety-gsnvb
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
construction-safety-gsnvb
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
    'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
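As a quick illustration of the COCO box layout described above, the sketch below converts an `[x_min, y_min, width, height]` box to corner coordinates. The helper function is hypothetical (not part of the dataset or the `datasets` library); the values come from the example instance shown earlier.

```python
def coco_to_corners(bbox):
    """Convert a COCO-format [x_min, y_min, width, height] box
    to [x_min, y_min, x_max, y_max] corner coordinates."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the example data instance above
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```

This is often needed because many visualization and augmentation tools expect corner-style (Pascal VOC) boxes rather than COCO's width/height encoding.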
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/construction-safety-gsnvb
### Citation Information
```
@misc{ construction-safety-gsnvb,
title = { construction safety gsnvb Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/construction-safety-gsnvb } },
url = { https://universe.roboflow.com/object-detection/construction-safety-gsnvb },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
takojunior/llama_2_finetune | ---
license: apache-2.0
---
|
goodfellowliu/Set5 | ---
license: openrail
language:
- en
--- |
hhhwmws/dingchunqiu | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
Ding Chunqiu data supporting ChatHaruhi2. It can be loaded as follows:
```python
from chatharuhi import ChatHaruhi
chatbot = ChatHaruhi( role_from_hf = 'hhhwmws/dingchunqiu', \
llm = 'openai')
response = chatbot.chat(role='萧峰', text = '丁春秋!')
print(response)
```
Uploader: Mi Weishi (米唯实)
For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)
You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)
### Citation
Please cite the repo if you use the data or code in this repo.
```
@misc{li2023chatharuhi,
title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
year={2023},
eprint={2308.09597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
mask-distilled-libri-one-sec-cv12/chunk_7 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: logits
sequence: float32
splits:
- name: train
num_bytes: 240411378.40433368
num_examples: 7499
download_size: 181447002
dataset_size: 240411378.40433368
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mfidabel/sam-coyo-2k | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1717753206.88
num_examples: 2240
download_size: 1815819421
dataset_size: 1717753206.88
---
|
landersanmi/BilbaoCaptions2 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 3185702553.866935
num_examples: 3781
- name: test
num_bytes: 797057555.1330653
num_examples: 946
download_size: 3952516923
dataset_size: 3982760109.0
---
# Dataset Card for "BilbaoCaptions2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Smuggling1710/test | ---
license: apache-2.0
---
|
louisbertson/mos_fr_dataset | ---
license: mit
language:
- fr
tags:
- mossi
- moore
- Burkina Faso
pretty_name: Mos to Fr
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T10:09:09.796535](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4/blob/main/results_2023-10-23T10-09-09.796535.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22713926174496643,\n\
\ \"em_stderr\": 0.004290781297690954,\n \"f1\": 0.2716809983221478,\n\
\ \"f1_stderr\": 0.004317738520761278,\n \"acc\": 0.33976344759040505,\n\
\ \"acc_stderr\": 0.006940874719140418\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.22713926174496643,\n \"em_stderr\": 0.004290781297690954,\n\
\ \"f1\": 0.2716809983221478,\n \"f1_stderr\": 0.004317738520761278\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6787687450670876,\n \"acc_stderr\": 0.013123599324558317\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T10_09_09.796535
path:
- '**/details_harness|drop|3_2023-10-23T10-09-09.796535.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T10-09-09.796535.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T10_09_09.796535
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-09-09.796535.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-09-09.796535.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-59.033048.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-59.033048.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T10_09_09.796535
path:
- '**/details_harness|winogrande|5_2023-10-23T10-09-09.796535.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T10-09-09.796535.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_32_59.033048
path:
- results_2023-09-11T17-32-59.033048.parquet
- split: 2023_10_23T10_09_09.796535
path:
- results_2023-10-23T10-09-09.796535.parquet
- split: latest
path:
- results_2023-10-23T10-09-09.796535.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T10:09:09.796535](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified4/blob/main/results_2023-10-23T10-09-09.796535.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.22713926174496643,
"em_stderr": 0.004290781297690954,
"f1": 0.2716809983221478,
"f1_stderr": 0.004317738520761278,
"acc": 0.33976344759040505,
"acc_stderr": 0.006940874719140418
},
"harness|drop|3": {
"em": 0.22713926174496643,
"em_stderr": 0.004290781297690954,
"f1": 0.2716809983221478,
"f1_stderr": 0.004317738520761278
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225188
},
"harness|winogrande|5": {
"acc": 0.6787687450670876,
"acc_stderr": 0.013123599324558317
}
}
```
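As a quick sanity check, the `"all"` block's `acc` can be reproduced from the per-task metrics above: it is the unweighted mean of the individual task accuracies. A minimal sketch, with the values copied from the JSON block (the variable names are illustrative, not part of the dataset schema):

```python
# Per-task accuracies copied from the latest results above
task_results = {
    "harness|gsm8k|5": {"acc": 0.000758150113722517},
    "harness|winogrande|5": {"acc": 0.6787687450670876},
}

# The aggregated "all" accuracy is the unweighted mean over tasks
mean_acc = sum(v["acc"] for v in task_results.values()) / len(task_results)
print(mean_acc)  # matches the "all" acc of 0.33976344759040505
```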
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
GGital/Signal_Test02 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
splits:
- name: train
num_bytes: 11566389.0
num_examples: 647
download_size: 11525815
dataset_size: 11566389.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/oasst_top1_standardized_cluster_4_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4249868
num_examples: 1833
download_size: 2302180
dataset_size: 4249868
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst_top1_standardized_cluster_4_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yan-ds/AI-sharing-test | ---
license: apache-2.0
---
|
sngsfydy/DR_Grading_413_103 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
splits:
- name: train
num_bytes: 261501746.0
num_examples: 413
- name: test
num_bytes: 64805638.0
num_examples: 103
download_size: 316625605
dataset_size: 326307384.0
---
# Dataset Card for "DR_Grading_413_103"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Devio__test-22B | ---
pretty_name: Evaluation run of Devio/test-22B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Devio/test-22B](https://huggingface.co/Devio/test-22B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__test-22B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T03:23:54.397499](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-22B/blob/main/results_2023-10-16T03-23-54.397499.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002936241610738255,\n\
\ \"em_stderr\": 0.0005541113054709917,\n \"f1\": 0.03323510906040272,\n\
\ \"f1_stderr\": 0.0011026689087019657,\n \"acc\": 0.2903720919378185,\n\
\ \"acc_stderr\": 0.0077888780496033275\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054709917,\n\
\ \"f1\": 0.03323510906040272,\n \"f1_stderr\": 0.0011026689087019657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501832\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5769534333070244,\n \"acc_stderr\": 0.013885055359056472\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Devio/test-22B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|arc:challenge|25_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T03_23_54.397499
path:
- '**/details_harness|drop|3_2023-10-16T03-23-54.397499.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T03-23-54.397499.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T03_23_54.397499
path:
- '**/details_harness|gsm8k|5_2023-10-16T03-23-54.397499.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T03-23-54.397499.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hellaswag|10_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T01:38:52.675251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T01:38:52.675251.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T01:38:52.675251.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T03_23_54.397499
path:
- '**/details_harness|winogrande|5_2023-10-16T03-23-54.397499.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T03-23-54.397499.parquet'
- config_name: results
data_files:
- split: 2023_09_02T01_38_52.675251
path:
- results_2023-09-02T01:38:52.675251.parquet
- split: 2023_10_16T03_23_54.397499
path:
- results_2023-10-16T03-23-54.397499.parquet
- split: latest
path:
- results_2023-10-16T03-23-54.397499.parquet
---
# Dataset Card for Evaluation run of Devio/test-22B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Devio/test-22B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Devio/test-22B](https://huggingface.co/Devio/test-22B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__test-22B",
"harness_winogrande_5",
	split="latest")
```
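The per-run split names are derived from the run timestamp: comparing the file names (e.g. `2023-10-16T03:23:54.397499`) with the split names in the YAML above (e.g. `2023_10_16T03_23_54.397499`) suggests the `-` and `:` separators are simply replaced with underscores. A minimal sketch of that apparent convention:

```python
# Sketch (an assumption read off the names above, not documented behavior):
# a run timestamp becomes a split name by replacing the "-" and ":" date/time
# separators with underscores; the fractional-second dot is kept.
def timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-16T03:23:54.397499"))
# 2023_10_16T03_23_54.397499
```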
## Latest results
These are the [latest results from run 2023-10-16T03:23:54.397499](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-22B/blob/main/results_2023-10-16T03-23-54.397499.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002936241610738255,
"em_stderr": 0.0005541113054709917,
"f1": 0.03323510906040272,
"f1_stderr": 0.0011026689087019657,
"acc": 0.2903720919378185,
"acc_stderr": 0.0077888780496033275
},
"harness|drop|3": {
"em": 0.002936241610738255,
"em_stderr": 0.0005541113054709917,
"f1": 0.03323510906040272,
"f1_stderr": 0.0011026689087019657
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501832
},
"harness|winogrande|5": {
"acc": 0.5769534333070244,
"acc_stderr": 0.013885055359056472
}
}
```
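As a sanity check on the numbers above, the "all" accuracy appears to be the unweighted mean of the per-task `acc` values (an observation from these figures, not documented leaderboard behavior; `drop` reports only `em`/`f1`, so it does not enter the accuracy average):

```python
# Observation (assumption, not documented behavior): the "all" acc reported
# above matches the unweighted mean of the per-task acc values.
per_task_acc = {
    "harness|gsm8k|5": 0.0037907505686125853,
    "harness|winogrande|5": 0.5769534333070244,
}
overall_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(overall_acc)  # ~0.2903720919378185, matching the "all" block
```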
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rocknrj/Test_ENISA_EXTRACTED | ---
license: other
---
|
AdapterOcean/med_alpaca_standardized_cluster_55 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 81675669
num_examples: 8149
download_size: 24284093
dataset_size: 81675669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_55"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B | ---
pretty_name: Evaluation run of ABX-AI/Silver-Sun-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ABX-AI/Silver-Sun-11B](https://huggingface.co/ABX-AI/Silver-Sun-11B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T11:23:48.663620](https://huggingface.co/datasets/open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B/blob/main/results_2024-04-09T11-23-48.663620.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6613299971122604,\n\
\ \"acc_stderr\": 0.03117186211934933,\n \"acc_norm\": 0.6730584240663938,\n\
\ \"acc_norm_stderr\": 0.03199188346673098,\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.618855922705881,\n\
\ \"mc2_stderr\": 0.015586954390037554\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880533,\n\
\ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.692989444333798,\n\
\ \"acc_stderr\": 0.004603111343213067,\n \"acc_norm\": 0.8791077474606652,\n\
\ \"acc_norm_stderr\": 0.0032533576201717973\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8451612903225807,\n \"acc_stderr\": 0.020579287326583227,\n \"\
acc_norm\": 0.8451612903225807,\n \"acc_norm_stderr\": 0.020579287326583227\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.02282888177524938,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.02282888177524938\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8676470588235294,\n \"acc_stderr\": 0.02378429752091886,\n \"\
acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.02378429752091886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878463,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878463\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49385474860335193,\n\
\ \"acc_stderr\": 0.016721238483631412,\n \"acc_norm\": 0.49385474860335193,\n\
\ \"acc_norm_stderr\": 0.016721238483631412\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.500651890482399,\n\
\ \"acc_stderr\": 0.01277022525225556,\n \"acc_norm\": 0.500651890482399,\n\
\ \"acc_norm_stderr\": 0.01277022525225556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7610294117647058,\n \"acc_stderr\": 0.025905280644893006,\n\
\ \"acc_norm\": 0.7610294117647058,\n \"acc_norm_stderr\": 0.025905280644893006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.696078431372549,\n \"acc_stderr\": 0.018607552131279827,\n \
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.018607552131279827\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.618855922705881,\n\
\ \"mc2_stderr\": 0.015586954390037554\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028214\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480405\n }\n}\n```"
repo_url: https://huggingface.co/ABX-AI/Silver-Sun-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|winogrande|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T11-23-48.663620.parquet'
- config_name: results
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- results_2024-04-09T11-23-48.663620.parquet
- split: latest
path:
- results_2024-04-09T11-23-48.663620.parquet
---
# Dataset Card for Evaluation run of ABX-AI/Silver-Sun-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ABX-AI/Silver-Sun-11B](https://huggingface.co/ABX-AI/Silver-Sun-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B",
"harness_winogrande_5",
split="train")
```
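Each timestamped split name encodes the run time with the `-` and `:` separators of the ISO timestamp replaced by underscores (e.g. `2024_04_09T11_23_48.663620`). A small helper (hypothetical, not part of this card or the `datasets` API) can recover a `datetime` from such a split name — a minimal sketch:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names such as "2024_04_09T11_23_48.663620" use "_" in place
    # of the "-" (date) and ":" (time) separators of the ISO timestamp.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2024_04_09T11_23_48.663620"))
# → 2024-04-09 11:23:48.663620
```

This makes it easy to sort the timestamped splits of a configuration chronologically when several evaluation runs accumulate.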
## Latest results
These are the [latest results from run 2024-04-09T11:23:48.663620](https://huggingface.co/datasets/open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B/blob/main/results_2024-04-09T11-23-48.663620.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6613299971122604,
"acc_stderr": 0.03117186211934933,
"acc_norm": 0.6730584240663938,
"acc_norm_stderr": 0.03199188346673098,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.618855922705881,
"mc2_stderr": 0.015586954390037554
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880533,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716413
},
"harness|hellaswag|10": {
"acc": 0.692989444333798,
"acc_stderr": 0.004603111343213067,
"acc_norm": 0.8791077474606652,
"acc_norm_stderr": 0.0032533576201717973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.02282888177524938,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.02282888177524938
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.02378429752091886,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.02378429752091886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878463,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878463
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49385474860335193,
"acc_stderr": 0.016721238483631412,
"acc_norm": 0.49385474860335193,
"acc_norm_stderr": 0.016721238483631412
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.500651890482399,
"acc_stderr": 0.01277022525225556,
"acc_norm": 0.500651890482399,
"acc_norm_stderr": 0.01277022525225556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7610294117647058,
"acc_stderr": 0.025905280644893006,
"acc_norm": 0.7610294117647058,
"acc_norm_stderr": 0.025905280644893006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.018607552131279827,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.018607552131279827
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900798,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.618855922705881,
"mc2_stderr": 0.015586954390037554
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028214
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480405
}
}
```
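The per-task entries in the results JSON above all share the same shape, so aggregate scores can be computed with a short script. A minimal sketch (only a few of the tasks shown are reproduced; key names are taken from the JSON above):

```python
# Average the per-task MMLU ("hendrycksTest") accuracies from a results
# dict shaped like the JSON above; non-MMLU tasks are filtered out.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7697368421052632},
    "harness|gsm8k|5": {"acc": 0.00530705079605762},
}
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # 0.5857
```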
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ParityError/ControlNet-Shadows | ---
dataset_info:
features:
- name: frame
dtype: string
- name: target
dtype: image
- name: shadow
dtype: image
- name: position
dtype: string
- name: heading
dtype: string
- name: direction
dtype: string
- name: elevation
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 2308840037.0
num_examples: 3000
download_size: 2227889206
dataset_size: 2308840037.0
---
# Dataset Card for "Shadow-Dataset-ControlNet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexrods/mini_car_bikes_detection | ---
license: other
---
|
Labagaite/StableCascade_Lora_Training_sample | ---
license: gpl-3.0
---
|
distilled-from-one-sec-cv12/chunk_85 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1447896292
num_examples: 282131
download_size: 1479545130
dataset_size: 1447896292
---
# Dataset Card for "chunk_85"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EAST/autotrain-data-Rule | ---
language:
- zh
task_categories:
- text-classification
---
# AutoTrain Dataset for project: Rule
## Dataset Description
This dataset has been automatically processed by AutoTrain for project Rule.
### Languages
The BCP-47 code for the dataset's language is zh.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "\u672c\u516c\u53f8\u4f1a\u5728\u60a8\u767b\u5f55\u53ca\u7248\u672c\u66f4\u65b0\u65f6\u4ee5\u63a8\u9001\u901a\u77e5\u3001\u5f39\u6846\u7684\u5f62\u5f0f\u5411\u60a8\u5c55\u793a\u53d8\u66f4\u540e\u7684\u9690\u79c1\u653f\u7b56",
"target": 1
},
{
"text": "\u6211\u4eec\u53ef\u80fd\u9002\u65f6\u4f1a\u5bf9\u672c\u9690\u79c1\u6743\u653f\u7b56\u8fdb\u884c\u8c03\u6574\u6216\u53d8\u66f4\uff0c\u672c\u9690\u79c1\u6743\u653f\u7b56\u7684\u4efb\u4f55\u66f4\u65b0\u5c06\u4ee5\u6807\u6ce8\u66f4\u65b0\u65f6\u95f4\u7684\u65b9\u5f0f\u516c\u5e03\u5728\u6211\u4eec\u7f51\u7ad9\u4e0a\uff0c\u9664\u6cd5\u5f8b\u6cd5\u89c4\u6216\u76d1\u7ba1\u89c4\u5b9a\u53e6\u6709\u5f3a\u5236\u6027\u89c4\u5b9a\u5916\uff0c\u7ecf\u8c03\u6574\u6216\u53d8\u66f4\u7684\u5185\u5bb9\u4e00\u7ecf\u901a\u77e5\u6216\u516c\u5e03\u540e\u76847\u65e5\u540e\u751f\u6548",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=2, names=['0', '1'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 70 |
| valid | 19 |
|
CyberHarem/blucher_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of blucher/ブリュッヒャー/布吕歇尔 (Azur Lane)
This is the dataset of blucher/ブリュッヒャー/布吕歇尔 (Azur Lane), containing 40 images and their tags.
The core tags of this character are `long_hair, blonde_hair, red_eyes, breasts, ahoge, bangs, twintails, fang, large_breasts, skin_fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 60.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 31.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 104 | 72.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 52.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 104 | 104.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/blucher_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | smile, 1girl, looking_at_viewer, solo, open_mouth, blush, black_gloves, red_scarf, red_skirt, black_thighhighs, fingerless_gloves, white_background, hair_between_eyes, plaid_skirt, simple_background, pleated_skirt |
| 1 | 7 |  |  |  |  |  | 1girl, bodysuit, goggles_on_head, looking_at_viewer, smile, solo, ass, fake_tail, long_sleeves, official_alternate_costume, rabbit_tail, sideboob, cropped_jacket, open_mouth, white_jacket, bandaid_on_face, blush, from_behind, blue_sky, day, full_body, medium_breasts, outdoors, shoes, snow, white_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | smile | 1girl | looking_at_viewer | solo | open_mouth | blush | black_gloves | red_scarf | red_skirt | black_thighhighs | fingerless_gloves | white_background | hair_between_eyes | plaid_skirt | simple_background | pleated_skirt | bodysuit | goggles_on_head | ass | fake_tail | long_sleeves | official_alternate_costume | rabbit_tail | sideboob | cropped_jacket | white_jacket | bandaid_on_face | from_behind | blue_sky | day | full_body | medium_breasts | outdoors | shoes | snow | white_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:-------------|:--------|:---------------|:------------|:------------|:-------------------|:--------------------|:-------------------|:--------------------|:--------------|:--------------------|:----------------|:-----------|:------------------|:------|:------------|:---------------|:-----------------------------|:--------------|:-----------|:-----------------|:---------------|:------------------|:--------------|:-----------|:------|:------------|:-----------------|:-----------|:--------|:-------|:---------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
fsuarez/autotrain-data-logo_identifier_v4_short | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: logo_identifier_v4_short
## Dataset Description
This dataset has been automatically processed by AutoTrain for project logo_identifier_v4_short.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<128x128 RGB PIL image>",
"target": 98
},
{
"image": "<100x100 RGB PIL image>",
"target": 99
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['20thTelevision', '3M', '7Eleven', 'Acer', 'AmericanExpress', 'Amul', 'Anthem', 'ApolloHospitals', 'Apple', 'Armani', 'Asahi', 'Asus', 'Atari', 'Audi', 'Avon', 'Booking', 'Bosch', 'Bridgestone', 'British Airways', 'Budweiser', 'Burberry', 'BurgerKing', 'BuzzFeed', 'Canon', 'CocaColaZero', 'Coleman', 'Coles', 'Converse', 'CornFlakes', 'Corona', 'CostcoWholesale', 'Crayola', 'Credit Agricole', 'Crocs', 'Crunchyroll', 'Ctrip', 'Dropbox', 'Ducati', 'DunkinDonuts', 'Duracell', 'Dyson', 'Ethereum', 'ExxonMobil', 'FoxNews', 'FreddieMac', 'Fujitsu', 'Goodyear', 'Grubhub', 'Gucci', 'Huawei', 'Hudson Bay Company', 'HugoBoss', 'Hulu', 'Hyundai', 'Instagram', 'Intel', 'John Lewis & Partners', 'Johnson&Johnson', 'Kingston', 'LouisVuitton', 'Lowes', 'Lufthansa', 'Lululemon', 'Luxottica', 'MorganStanley', 'Motorola', 'MountainDew', 'Moutai', 'Movistar', 'Msci', 'Muji', 'Nike', 'Nissan', 'Nokia', 'Nvidia', 'Orange', 'Oreo', 'Porsche', 'Power China', 'Prada', 'Pringles', 'Publix', 'Puma', 'Purina', 'PwC', 'Qualcomm', 'Rolex', 'Rolls-Royce', 'RoyalCaribbean', 'Spotify', 'Sprite', 'Starbucks', 'StateBankofIndia', 'StateGrid', 'Subaru', 'Subway', 'SumitomoGroup', 'Suning', 'Supreme', 'Suzuki', 'Total SA', 'TotalEnergies', 'Toyota', 'TripAdvisor', 'Twitch', 'Twitter', 'UnitedHealthCare', 'Universal', 'Volkswagen', 'Volvo', 'Wikipedia', 'Wipro', 'Wuliangye', 'Xiaomi', 'Youtube', 'Zoom', 'hennessy', 'iHeartRadio', 'koolAid'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 6884 |
| valid | 1786 |
|
acool/ad_micro_synth_cube_stick | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 45461814.0
num_examples: 600
download_size: 45437186
dataset_size: 45461814.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ad_micro_synth_cube_stick"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
presencesw/dataset3_translated | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: references
sequence: string
- name: question_vi
dtype: string
- name: answer_vi
dtype: string
- name: references_vi
sequence: string
splits:
- name: train
num_bytes: 55148862
num_examples: 9000
download_size: 28244205
dataset_size: 55148862
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RikoteMaster/llama2_classifying_and_explainning_v3 | ---
dataset_info:
features:
- name: Explanation
dtype: string
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 22682634
num_examples: 20188
download_size: 6798524
dataset_size: 22682634
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2_classifying_and_explainning_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
babs/multilingual-classification-dataset | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: language
struct:
- name: language
dtype: string
splits:
- name: train
num_bytes: 32443542061.635834
num_examples: 79786
download_size: 31344423962
dataset_size: 32443542061.635834
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/synpre_union_1M | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 1167868421
num_examples: 1000000
- name: validation
num_bytes: 11660114
num_examples: 10000
download_size: 788391948
dataset_size: 1179528535
---
# Dataset Card for "synpre_union_1M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dippi9845/arxiv_with_fragments_clean | ---
license: cc-by-nc-sa-4.0
---
|
stanmalkinson199/TweekTweakPTBR | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_qqp_indefinite_for_definite_articles | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3364363
num_examples: 20825
- name: test
num_bytes: 33293109
num_examples: 206423
- name: train
num_bytes: 30376125
num_examples: 187936
download_size: 41389280
dataset_size: 67033597
---
# Dataset Card for "MULTI_VALUE_qqp_indefinite_for_definite_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adityarra07/zurich_data | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 537406557.186
num_examples: 2189
download_size: 535954349
dataset_size: 537406557.186
---
# Dataset Card for "zurich_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MartinKu/whalley_dataset | ---
dataset_info:
features:
- name: TEXT
dtype: string
splits:
- name: train
num_bytes: 1462302
num_examples: 2682
download_size: 823459
dataset_size: 1462302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "whalley_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nan-Do/instructional_code-search-net-ruby | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 30679722
num_examples: 51470
download_size: 12427089
dataset_size: 30679722
license: apache-2.0
task_categories:
- conversational
- text-generation
- text2text-generation
language:
- en
tags:
- Ruby
- Code Generation
- Instruction Response
pretty_name: Instructional Ruby Dataset
---
# Dataset Card for "instructional_code-search-net-ruby"
## Dataset Description
- **Homepage:** None
- **Repository:** https://huggingface.co/datasets/Nan-Do/instructional_code-search-net-ruby
- **Paper:** None
- **Leaderboard:** None
- **Point of Contact:** [@Nan-Do](https://github.com/Nan-Do)
### Dataset Summary
This is an instructional dataset for Ruby.
The dataset contains two different kinds of tasks:
- Given a piece of code generate a description of what it does.
- Given a description generate a piece of code that fulfils the description.
### Languages
The dataset is in English.
### Data Splits
There are no splits.
## Dataset Creation
May of 2023
### Curation Rationale
This dataset was created to improve the coding capabilities of LLMs.
### Source Data
The summarized version of the code-search-net dataset can be found at https://huggingface.co/datasets/Nan-Do/code-search-net-ruby
### Annotations
The dataset includes instruction and response columns.
#### Annotation process
The annotation procedure was done using templates and NLP techniques to generate human-like instructions and responses.
A sample notebook of the process can be found at https://github.com/Nan-Do/OpenAssistantInstructionResponsePython
The annotations have been cleaned to make sure there are no repetitions and/or meaningless summaries.
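As a rough illustration of that template-based process (a sketch with hypothetical templates, not the actual ones used for this dataset), each code/summary pair from the source data can be expanded into one example per task type:

```python
import random

# Hypothetical templates mirroring the two task types described in this card;
# the real templates used to build the dataset are not reproduced here.
CODE_TO_DESC = [
    "Explain what the following Ruby function does:\n{code}",
    "Describe the purpose of this Ruby snippet:\n{code}",
]
DESC_TO_CODE = [
    "Write a Ruby function that {summary}.",
]

def make_pairs(code, summary, rng=random.Random(0)):
    """Build one INSTRUCTION/RESPONSE example per task type."""
    return [
        {"INSTRUCTION": rng.choice(CODE_TO_DESC).format(code=code),
         "RESPONSE": summary, "SOURCE": "codesearchnet"},
        {"INSTRUCTION": rng.choice(DESC_TO_CODE).format(summary=summary),
         "RESPONSE": code, "SOURCE": "codesearchnet"},
    ]

pairs = make_pairs("def add(a, b)\n  a + b\nend", "adds two integers")
```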
### Licensing Information
Apache 2.0 |
Back-up/food-100 | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
list:
- name: answer
dtype: string
- name: key
dtype: string
- name: answer
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 102965
num_examples: 101
download_size: 26468
dataset_size: 102965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "food-100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bnithish/question_difficulty | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 26750
num_examples: 67
download_size: 9307
dataset_size: 26750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Doub7e/SDv2-Count-Repeated-6 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
- name: style
dtype: string
splits:
- name: train
num_bytes: 1476740562.5
num_examples: 1140
download_size: 1286925000
dataset_size: 1476740562.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|