| datasetId | card |
|---|---|
Multimodal-Fatima/CIFAR10_test_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 53491580.0
num_examples: 10000
download_size: 59803880
dataset_size: 53491580.0
---
# Dataset Card for "CIFAR10_test_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
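The card above describes 10,000 CLIP ViT-L/14 vision embeddings for the CIFAR-10 test set. A typical use is nearest-neighbour search over those vectors. The sketch below is illustrative, not part of the dataset: the split and feature names (`openai_clip_vit_large_patch14`, `vision_embeddings`) are taken from the card's metadata, and the commented `load_dataset` call is how one would plug in the real data.

```python
# Sketch: cosine-similarity nearest-neighbour search over vision embeddings.
import numpy as np

def top_k_similar(query: np.ndarray, embeddings: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k rows of `embeddings` most cosine-similar to `query`."""
    q = query / np.linalg.norm(query)
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = e @ q  # cosine similarity of each row against the query
    return np.argsort(-sims)[:k]

# With the real data (downloads the dataset; names taken from the card):
# from datasets import load_dataset
# ds = load_dataset("Multimodal-Fatima/CIFAR10_test_embeddings",
#                   split="openai_clip_vit_large_patch14")
# emb = np.asarray(ds["vision_embeddings"], dtype=np.float32)
# print(top_k_similar(emb[0], emb))
```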
DataStudio/OCRWordLevelClear_02 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 5672588665.0
num_examples: 1825691
download_size: 5164200573
dataset_size: 5672588665.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bobfu/cats | ---
license: cc0-1.0
---
|
NetherlandsForensicInstitute/quora-duplicates-translated-nl | ---
viewer: true
task_categories:
- sentence-similarity
language:
- nl
size_categories:
- 100K<n<1M
---
This is a Dutch version of the [Quora Duplicates](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) dataset, which we auto-translated from English into Dutch using Meta's [No Language Left Behind](https://ai.facebook.com/research/no-language-left-behind/) model, specifically the [Hugging Face implementation](https://huggingface.co/facebook/nllb-200-distilled-600M). For more information about the use of this dataset, please refer to the [Quora Terms of Service](https://www.quora.com/about/tos). |
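The translation setup described above can be sketched as follows. NLLB-200 addresses languages by FLORES-200 codes rather than plain ISO codes, so English and Dutch map to `eng_Latn` and `nld_Latn`; the helper and the pipeline call below are illustrative of that setup, not the exact script the dataset authors ran.

```python
# Sketch: English-to-Dutch translation with facebook/nllb-200-distilled-600M.
# NLLB-200 expects FLORES-200 language codes; this small map covers the two
# languages involved in this dataset.
FLORES_CODES = {"en": "eng_Latn", "nl": "nld_Latn"}

def flores_code(iso_code: str) -> str:
    """Map an ISO 639-1 code to the FLORES-200 code NLLB-200 expects."""
    try:
        return FLORES_CODES[iso_code]
    except KeyError:
        raise ValueError(f"no FLORES-200 mapping for {iso_code!r}")

def translate_en_to_nl(texts):
    """Translate English sentences to Dutch (downloads the ~600M model on first use)."""
    from transformers import pipeline  # heavyweight import, kept local
    translator = pipeline(
        "translation",
        model="facebook/nllb-200-distilled-600M",
        src_lang=flores_code("en"),
        tgt_lang=flores_code("nl"),
    )
    return [out["translation_text"] for out in translator(texts)]
```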
open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1 | ---
pretty_name: Evaluation run of speakleash/Bielik-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speakleash/Bielik-7B-v0.1](https://huggingface.co/speakleash/Bielik-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T23:35:06.556889](https://huggingface.co/datasets/open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1/blob/main/results_2024-04-09T23-35-06.556889.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.4716293062756141,\n\
\ \"acc_stderr\": 0.03461349827241483,\n \"acc_norm\": 0.47480802187603194,\n\
\ \"acc_norm_stderr\": 0.035346442062677605,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4320429372007698,\n\
\ \"mc2_stderr\": 0.014925535179229217\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41638225255972694,\n \"acc_stderr\": 0.014405618279436176,\n\
\ \"acc_norm\": 0.4522184300341297,\n \"acc_norm_stderr\": 0.014544519880633832\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5073690499900418,\n\
\ \"acc_stderr\": 0.004989239462835232,\n \"acc_norm\": 0.6792471619199363,\n\
\ \"acc_norm_stderr\": 0.004658120152230819\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983063,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5516129032258065,\n\
\ \"acc_stderr\": 0.02829205683011274,\n \"acc_norm\": 0.5516129032258065,\n\
\ \"acc_norm_stderr\": 0.02829205683011274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4205128205128205,\n \"acc_stderr\": 0.02502861027671086,\n \
\ \"acc_norm\": 0.4205128205128205,\n \"acc_norm_stderr\": 0.02502861027671086\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6036697247706422,\n \"acc_stderr\": 0.02097146994790053,\n \"\
acc_norm\": 0.6036697247706422,\n \"acc_norm_stderr\": 0.02097146994790053\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510923,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510923\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.034658681963807614,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.034658681963807614\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.048257293373563895,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.048257293373563895\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674047,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674047\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6360153256704981,\n\
\ \"acc_stderr\": 0.017205684809032232,\n \"acc_norm\": 0.6360153256704981,\n\
\ \"acc_norm_stderr\": 0.017205684809032232\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.0269150473553698,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.0269150473553698\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220503,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220503\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.028555827516528784,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.028555827516528784\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.028345045864840625,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.028345045864840625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02782074420373286,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02782074420373286\n },\n\
\ \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3262411347517731,\n\
\ \"acc_stderr\": 0.027968453043563168,\n \"acc_norm\": 0.3262411347517731,\n\
\ \"acc_norm_stderr\": 0.027968453043563168\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.3376792698826597,\n \"acc_stderr\": 0.012078563777145552,\n\
\ \"acc_norm\": 0.3376792698826597,\n \"acc_norm_stderr\": 0.012078563777145552\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016636,\n \"\
acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016636\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42810457516339867,\n \"acc_stderr\": 0.020017629214213108,\n \
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.020017629214213108\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4320429372007698,\n\
\ \"mc2_stderr\": 0.014925535179229217\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.01323039719896465\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29492039423805916,\n \
\ \"acc_stderr\": 0.012560698010954767\n }\n}\n```"
repo_url: https://huggingface.co/speakleash/Bielik-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|arc:challenge|25_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|gsm8k|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hellaswag|10_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|winogrande|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|winogrande|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T23-35-06.556889.parquet'
- config_name: results
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- results_2024-04-07T13-58-19.064215.parquet
- split: 2024_04_09T23_35_06.556889
path:
- results_2024-04-09T23-35-06.556889.parquet
- split: latest
path:
- results_2024-04-09T23-35-06.556889.parquet
---
# Dataset Card for Evaluation run of speakleash/Bielik-7B-v0.1
Dataset automatically created during the evaluation run of model [speakleash/Bielik-7B-v0.1](https://huggingface.co/speakleash/Bielik-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1",
"harness_winogrande_5",
	split="latest")
```
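Because each run is stored under a split named after its timestamp (with `latest` as an alias), you can also recover the most recent run programmatically by sorting the split names. A minimal sketch, using the two split names that appear in this card's configuration:

```python
# Split names follow the run-timestamp format used by this dataset,
# e.g. "2024_04_07T13_58_19.064215", plus a "latest" alias.
splits = ["2024_04_07T13_58_19.064215", "2024_04_09T23_35_06.556889", "latest"]

# The timestamped names sort lexicographically in chronological order,
# so the newest run is simply the maximum of the non-alias names.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # 2024_04_09T23_35_06.556889
```

In practice this list of splits can be obtained with `datasets.get_dataset_split_names` on any of the configurations above.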
## Latest results
These are the [latest results from run 2024-04-09T23:35:06.556889](https://huggingface.co/datasets/open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1/blob/main/results_2024-04-09T23-35-06.556889.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.4716293062756141,
"acc_stderr": 0.03461349827241483,
"acc_norm": 0.47480802187603194,
"acc_norm_stderr": 0.035346442062677605,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4320429372007698,
"mc2_stderr": 0.014925535179229217
},
"harness|arc:challenge|25": {
"acc": 0.41638225255972694,
"acc_stderr": 0.014405618279436176,
"acc_norm": 0.4522184300341297,
"acc_norm_stderr": 0.014544519880633832
},
"harness|hellaswag|10": {
"acc": 0.5073690499900418,
"acc_stderr": 0.004989239462835232,
"acc_norm": 0.6792471619199363,
"acc_norm_stderr": 0.004658120152230819
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983063,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.02829205683011274,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.02829205683011274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.037818873532059816,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.037818873532059816
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4205128205128205,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.4205128205128205,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6036697247706422,
"acc_stderr": 0.02097146994790053,
"acc_norm": 0.6036697247706422,
"acc_norm_stderr": 0.02097146994790053
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510923,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510923
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.034658681963807614,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.034658681963807614
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.048257293373563895,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.048257293373563895
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674047,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674047
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6360153256704981,
"acc_stderr": 0.017205684809032232,
"acc_norm": 0.6360153256704981,
"acc_norm_stderr": 0.017205684809032232
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.0269150473553698,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.0269150473553698
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220503,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.028555827516528784,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.028555827516528784
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.028345045864840625,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.028345045864840625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5,
"acc_stderr": 0.02782074420373286,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02782074420373286
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3262411347517731,
"acc_stderr": 0.027968453043563168,
"acc_norm": 0.3262411347517731,
"acc_norm_stderr": 0.027968453043563168
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3376792698826597,
"acc_stderr": 0.012078563777145552,
"acc_norm": 0.3376792698826597,
"acc_norm_stderr": 0.012078563777145552
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.020017629214213108,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.020017629214213108
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4320429372007698,
"mc2_stderr": 0.014925535179229217
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.01323039719896465
},
"harness|gsm8k|5": {
"acc": 0.29492039423805916,
"acc_stderr": 0.012560698010954767
}
}
```
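The block above is the raw per-task JSON emitted by the evaluation harness. As a minimal sketch of reading it (subtask names and accuracy values copied from the results above; the real file holds all 57 `hendrycksTest` subtasks), a macro-average over the MMLU entries can be computed like this:

```python
# Illustrative only: a small subset of the hendrycksTest accuracies listed above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4740740740740741},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4605263157894737},
}

# Macro-average: the unweighted mean of the per-subtask accuracies.
mmlu_accs = [v["acc"] for name, v in results.items() if "hendrycksTest" in name]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_avg, 4))  # 0.4215
```

How the leaderboard itself aggregates subtasks is defined by the leaderboard code; this is only a sketch of working with the result dictionaries.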
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jlh-ibm/earnings_call | ---
license: cc0-1.0
task_categories:
- text-classification
language:
- en
tags:
- finance
pretty_name: Earnings Calls Dataset
size_categories:
- 10K<n<100K
dataset_info:
- config_name: stock_prices
features:
- name: date
dtype: date64
- name: open
dtype: float32
- name: high
dtype: float32
- name: low
dtype: float32
- name: close
dtype: float32
- name: adj_close
dtype: float32
- name: volume
dtype: int64
- name: company
dtype: string
splits:
- name: train
num_bytes: 578818
num_examples: 13155
download_size: 290243
dataset_size: 578818
- config_name: transcript-sentiment
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: company
dtype: string
- name: date
dtype: date64
- name: para_no
dtype: int32
splits:
- name: train
num_bytes: 7414686
num_examples: 6851
- name: test
num_bytes: 1928515
num_examples: 1693
download_size: 3868059
dataset_size: 9343201
- config_name: transcripts
features:
- name: company
dtype: string
- name: date
dtype: date64
- name: transcript
dtype: string
splits:
- name: train
num_bytes: 9592380
num_examples: 150
- name: test
num_bytes: 2458569
num_examples: 38
download_size: 3577816
dataset_size: 12050949
---
# Dataset Card for Earnings Calls Dataset
## Dataset Description
- **Homepage:** https://dataverse.nl/dataset.xhtml?persistentId=doi:10.34894/TJE0D0
- **Paper:** https://www.preprints.org/manuscript/202102.0424/v1
- **Point of Contact:** [Francesco Lelli](https://francescolelli.info/)
### Dataset Summary
The dataset contains a collection of earnings call transcripts, the related stock prices, and the sector index. In terms of
volume, there are a total of 188 transcripts, 11,970 stock prices, and 1,196 sector index values. All of these data originated
in the period 2016-2020 and relate to the NASDAQ stock market. The data collection was made possible by Yahoo Finance and
Thomson Reuters Eikon: Yahoo Finance enabled the search for stock values, and Thomson Reuters Eikon provided the earnings call
transcripts. The dataset can be used as a benchmark for evaluating several NLP techniques and understanding their potential for
financial applications. It is also possible to expand the dataset by extending the period in which the data originated,
following a similar procedure.
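As an illustration of how the `stock_prices` configuration might be used in a sentiment-versus-price study, here is a minimal, dependency-free sketch of computing a one-day return from consecutive closing prices. The sample rows and ticker are invented for illustration; only the field names (`date`, `close`, `company`) come from the schema in the YAML header above:

```python
# Hypothetical rows mirroring the stock_prices schema (field names from the YAML above).
rows = [
    {"date": "2019-01-02", "close": 100.0, "company": "AAPL"},
    {"date": "2019-01-03", "close": 95.0, "company": "AAPL"},
]

def daily_return(prev_close: float, close: float) -> float:
    """Simple one-day return from two consecutive closing prices."""
    return (close - prev_close) / prev_close

r = daily_return(rows[0]["close"], rows[1]["close"])
print(round(r, 4))  # -0.05
```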
### Citation Information
```bibtex
@data{TJE0D0_2021,
author = {Roozen, Dexter and Lelli, Francesco},
publisher = {DataverseNL},
title = {{Stock Values and Earnings Call Transcripts: a Sentiment Analysis Dataset}},
year = {2021},
version = {V1},
doi = {10.34894/TJE0D0},
url = {https://doi.org/10.34894/TJE0D0}
}
```
|
softcatala/ca_text_corpus | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- ca
license:
- cc0-1.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
pretty_name: ca-text-corpus
---
# Dataset Card for ca-text-corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/Softcatala/ca-text-corpus
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Public domain corpus of Catalan text.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Catalan (`ca`).
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[CC0 1.0 Universal](https://creativecommons.org/publicdomain/zero/1.0/).
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@albertvillanova](https://github.com/albertvillanova) for adding this dataset.
|
syedhuq/newXYZ | ---
license: llama2
---
|
isaacrehg/poetry-summary | ---
dataset_info:
features:
- name: _id
dtype: int64
- name: title
dtype: string
- name: author
dtype: string
- name: url
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 1881481
num_examples: 3420
download_size: 1080975
dataset_size: 1881481
---
# Dataset Card for "poetry-summary"
This dataset contains scraped poem summaries. Poems in this dataset also appear in [isaacrehg/poetry-detailed-analysis](https://huggingface.co/datasets/isaacrehg/poetry-detailed-analysis).
Each row contains the following data:
- _id: ID of this poem (for reference in [isaacrehg/poetry-detailed-analysis](https://huggingface.co/datasets/isaacrehg/poetry-detailed-analysis))
- title: The title of the poem
- author: The poem's author
- url: Source URL for this entry
- summary: The crawled summary for this poem
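Since `_id` links each row to its counterpart in the detailed-analysis dataset, the two can be joined on that field. A minimal stdlib sketch with made-up rows (the `analysis` field name for the companion dataset is assumed for illustration):

```python
# Made-up rows standing in for the two datasets; only the _id linkage matters here.
summaries = [
    {"_id": 1, "title": "Ode", "author": "A. Poet", "summary": "A short summary."},
]
analyses = [
    {"_id": 1, "analysis": "A detailed analysis."},
]

# Index the detailed analyses by _id, then attach each one to its summary row.
by_id = {row["_id"]: row for row in analyses}
joined = [
    {**s, "analysis": by_id[s["_id"]]["analysis"]}
    for s in summaries
    if s["_id"] in by_id
]
print(joined[0]["title"], "->", joined[0]["analysis"])  # Ode -> A detailed analysis.
```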
|
tarudesu/ViCTSD | ---
task_categories:
- text-classification
language:
- vi
size_categories:
- 10K<n<100K
pretty_name: Vietnamese Constructive and Toxic Speech Detection Dataset
---
# Constructive and Toxic Speech Detection for Open-domain Social Media Comments in Vietnamese
This is the official repository for the UIT-ViCTSD dataset from the paper [Constructive and Toxic Speech Detection for Open-domain Social Media Comments in Vietnamese](https://arxiv.org/pdf/2103.10069.pdf), which was accepted at the [IEA/AIE 2021](https://ieaaie2021.wordpress.com/list-of-accepted-papers/).
# Citation Information
The provided dataset may be used for research purposes only!
```
@InProceedings{nguyen2021victsd,
author="Nguyen, Luan Thanh and Van Nguyen, Kiet and Nguyen, Ngan Luu-Thuy",
title="Constructive and Toxic Speech Detection for Open-Domain Social Media Comments in Vietnamese",
booktitle="Advances and Trends in Artificial Intelligence. Artificial Intelligence Practices",
year="2021",
publisher="Springer International Publishing",
address="Cham",
pages="572--583"
}
```
## Abstract
The rise of social media has led to the increasing of comments on online forums. However, there still exists invalid comments which are not informative for users. Moreover, those comments are also quite toxic and harmful to people. In this paper, we create a dataset for constructive and toxic speech detection, named UIT-ViCTSD (Vietnamese Constructive and Toxic Speech Detection dataset) with 10,000 human-annotated comments. For these tasks, we propose a system for constructive and toxic speech detection with the state-of-the-art transfer learning model in Vietnamese NLP as PhoBERT. With this system, we obtain F1-scores of 78.59% and 59.40% for classifying constructive and toxic comments, respectively. Besides, we implement various baseline models as traditional Machine Learning and Deep Neural Network-Based models to evaluate the dataset. With the results, we can solve several tasks on the online discussions and develop the framework for identifying constructiveness and toxicity of Vietnamese social media comments automatically.
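The F1-scores quoted above are the harmonic mean of precision and recall; for reference, a tiny self-contained implementation (the values below are made up, not the paper's scores):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Made-up example values:
print(round(f1(0.8, 0.7), 4))  # 0.7467
```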
## Dataset
The ViCTSD dataset consists of 10,000 human-annotated comments across 10 domains, drawn from Vietnamese users' comments on social media.
The dataset is divided into three parts as below:
1. Train set: 7,000 comments
2. Valid set: 2,000 comments
3. Test set: 1,000 comments
## Contact
Please feel free to contact us by email at luannt@uit.edu.vn if you need any further information! |
irds/codesearchnet | ---
pretty_name: '`codesearchnet`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `codesearchnet`
The `codesearchnet` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/codesearchnet#codesearchnet).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=2,070,536
This dataset is used by: [`codesearchnet_challenge`](https://huggingface.co/datasets/irds/codesearchnet_challenge), [`codesearchnet_test`](https://huggingface.co/datasets/irds/codesearchnet_test), [`codesearchnet_train`](https://huggingface.co/datasets/irds/codesearchnet_train), [`codesearchnet_valid`](https://huggingface.co/datasets/irds/codesearchnet_valid)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/codesearchnet', 'docs')
for record in docs:
record # {'doc_id': ..., 'repo': ..., 'path': ..., 'func_name': ..., 'code': ..., 'language': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Husain2019CodeSearchNet,
title={CodeSearchNet Challenge: Evaluating the State of Semantic Code Search},
author={Hamel Husain and Ho-Hsiang Wu and Tiferet Gazit and Miltiadis Allamanis and Marc Brockschmidt},
journal={ArXiv},
year={2019}
}
```
|
open-llm-leaderboard/details_AA051611__whattest | ---
pretty_name: Evaluation run of AA051611/whattest
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051611/whattest](https://huggingface.co/AA051611/whattest) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__whattest\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T14:53:31.657383](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__whattest/blob/main/results_2024-01-11T14-53-31.657383.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7616138245668181,\n\
\ \"acc_stderr\": 0.028047923748497502,\n \"acc_norm\": 0.7655989871774836,\n\
\ \"acc_norm_stderr\": 0.02857778553593116,\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5803674233221108,\n\
\ \"mc2_stderr\": 0.014839457098843786\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268804,\n\
\ \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880534\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6463851822346146,\n\
\ \"acc_stderr\": 0.0047711430744261304,\n \"acc_norm\": 0.8442541326428998,\n\
\ \"acc_norm_stderr\": 0.003618731658837713\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n\
\ \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n \
\ \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8415094339622642,\n \"acc_stderr\": 0.02247652871016771,\n\
\ \"acc_norm\": 0.8415094339622642,\n \"acc_norm_stderr\": 0.02247652871016771\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n\
\ \"acc_stderr\": 0.024774516250440175,\n \"acc_norm\": 0.9027777777777778,\n\
\ \"acc_norm_stderr\": 0.024774516250440175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.049135952012745024,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.049135952012745024\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.02694748312149622,\n\
\ \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.02694748312149622\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7586206896551724,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.7586206896551724,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.5952380952380952,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9,\n \"acc_stderr\": 0.017066403719657255,\n \"acc_norm\": 0.9,\n\
\ \"acc_norm_stderr\": 0.017066403719657255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.033959703819985726,\n\
\ \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.033959703819985726\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"\
acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7974358974358975,\n \"acc_stderr\": 0.020377660970371397,\n\
\ \"acc_norm\": 0.7974358974358975,\n \"acc_norm_stderr\": 0.020377660970371397\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4111111111111111,\n \"acc_stderr\": 0.02999992350870669,\n \
\ \"acc_norm\": 0.4111111111111111,\n \"acc_norm_stderr\": 0.02999992350870669\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.02327425589870796,\n \
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.02327425589870796\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571743,\n \"\
acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571743\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.919831223628692,\n \"acc_stderr\": 0.017676679991891632,\n \
\ \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.017676679991891632\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.026222235171477364,\n\
\ \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.026222235171477364\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563274,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563274\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553855,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553855\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n\
\ \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.01789378490401854,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.01789378490401854\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n\
\ \"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n\
\ \"acc_norm_stderr\": 0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575277,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7251396648044692,\n\
\ \"acc_stderr\": 0.014931316703220508,\n \"acc_norm\": 0.7251396648044692,\n\
\ \"acc_norm_stderr\": 0.014931316703220508\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213512,\n\
\ \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213512\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n\
\ \"acc_stderr\": 0.02182342285774494,\n \"acc_norm\": 0.819935691318328,\n\
\ \"acc_norm_stderr\": 0.02182342285774494\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.017486432785880704,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.017486432785880704\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6276595744680851,\n \"acc_stderr\": 0.02883892147125145,\n \
\ \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.02883892147125145\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6140808344198174,\n\
\ \"acc_stderr\": 0.012433398911476138,\n \"acc_norm\": 0.6140808344198174,\n\
\ \"acc_norm_stderr\": 0.012433398911476138\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8786764705882353,\n \"acc_stderr\": 0.01983363748105792,\n\
\ \"acc_norm\": 0.8786764705882353,\n \"acc_norm_stderr\": 0.01983363748105792\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.815359477124183,\n \"acc_stderr\": 0.015697029240757776,\n \
\ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.015697029240757776\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.023420972069166338,\n\
\ \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.023420972069166338\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355027,\n\
\ \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355027\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5803674233221108,\n\
\ \"mc2_stderr\": 0.014839457098843786\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706179\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6944655041698257,\n \
\ \"acc_stderr\": 0.012688134076726882\n }\n}\n```"
repo_url: https://huggingface.co/speakleash/Bielik-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|arc:challenge|25_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|gsm8k|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hellaswag|10_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T14-53-31.657383.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- '**/details_harness|winogrande|5_2024-01-11T14-53-31.657383.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T14-53-31.657383.parquet'
- config_name: results
data_files:
- split: 2024_01_11T14_53_31.657383
path:
- results_2024-01-11T14-53-31.657383.parquet
- split: latest
path:
- results_2024-01-11T14-53-31.657383.parquet
---
# Dataset Card for Evaluation run of AA051611/whattest
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/whattest](https://huggingface.co/AA051611/whattest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__whattest",
    "harness_winogrande_5",
    split="train")
```
## Latest results
These are the [latest results from run 2024-01-11T14:53:31.657383](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__whattest/blob/main/results_2024-01-11T14-53-31.657383.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7616138245668181,
"acc_stderr": 0.028047923748497502,
"acc_norm": 0.7655989871774836,
"acc_norm_stderr": 0.02857778553593116,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5803674233221108,
"mc2_stderr": 0.014839457098843786
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.014070265519268804,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880534
},
"harness|hellaswag|10": {
"acc": 0.6463851822346146,
"acc_stderr": 0.0047711430744261304,
"acc_norm": 0.8442541326428998,
"acc_norm_stderr": 0.003618731658837713
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.024974533450920697,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.024974533450920697
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8415094339622642,
"acc_stderr": 0.02247652871016771,
"acc_norm": 0.8415094339622642,
"acc_norm_stderr": 0.02247652871016771
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440175,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.049135952012745024,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.049135952012745024
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.02694748312149622,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.02694748312149622
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657255,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7974358974358975,
"acc_stderr": 0.020377660970371397,
"acc_norm": 0.7974358974358975,
"acc_norm_stderr": 0.020377660970371397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4111111111111111,
"acc_stderr": 0.02999992350870669,
"acc_norm": 0.4111111111111111,
"acc_norm_stderr": 0.02999992350870669
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.02327425589870796,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.02327425589870796
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571743,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.017676679991891632,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.017676679991891632
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.026222235171477364,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.026222235171477364
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563274,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563274
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553855,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553855
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401854,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401854
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9003831417624522,
"acc_stderr": 0.010709685591251671,
"acc_norm": 0.9003831417624522,
"acc_norm_stderr": 0.010709685591251671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575277,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7251396648044692,
"acc_stderr": 0.014931316703220508,
"acc_norm": 0.7251396648044692,
"acc_norm_stderr": 0.014931316703220508
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213512,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213512
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.02182342285774494,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.02182342285774494
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.017486432785880704,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.017486432785880704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6140808344198174,
"acc_stderr": 0.012433398911476138,
"acc_norm": 0.6140808344198174,
"acc_norm_stderr": 0.012433398911476138
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8786764705882353,
"acc_stderr": 0.01983363748105792,
"acc_norm": 0.8786764705882353,
"acc_norm_stderr": 0.01983363748105792
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.015697029240757776,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.015697029240757776
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.023420972069166338,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.023420972069166338
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355027,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5803674233221108,
"mc2_stderr": 0.014839457098843786
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.010684179227706179
},
"harness|gsm8k|5": {
"acc": 0.6944655041698257,
"acc_stderr": 0.012688134076726882
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zakiasalod/VPAgs-Dataset4ML | ---
license: cc-by-4.0
task_categories:
- text-classification
tags:
- public health
- bioinformatics
- virus
- proteomics
- vaccine development
- antigen
- machine learning
- reverse vaccinology
- viral proteins
- protegen
- uniprot
pretty_name: VPAgs-Dataset4ML
size_categories:
- 1K<n<10K
---
# Dataset Card for VPAgs-Dataset4ML
## Dataset Details
### Dataset Description
**VPAgs-Dataset4ML** comprises 2,145 viral protein sequences, curated to facilitate the development of machine learning models capable of predicting viral protective antigens (PAgs). These antigens are crucial for designing vaccines against various viral pathogens. The dataset is divided into two categories: 210 protective antigens (positive class) and 1,935 non-protective protein sequences (negative class), derived from the Protegen database and UniProt, respectively. This collection aims to support and accelerate research in reverse vaccinology, providing a valuable resource for bioinformatics and public health.
- **Curated by:** Zakia Salod from the University of KwaZulu-Natal and Ozayr Mahomed from the University of KwaZulu-Natal and Dasman Diabetes Institute.
- **Funded by:** National Research Foundation (NRF) of South Africa (grant number 130187) and College of Health Sciences (CHS) of the University of KwaZulu-Natal (UKZN) in Durban, KwaZulu-Natal, South Africa.
- **Language(s) (NLP):** English.
- **License:** [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/)
### Dataset Sources
- **Repository:** Mendeley Data - [VPAgs-Dataset4ML](https://doi.org/10.17632/w78tyrjz4z.1)
- **Paper** Salod, Z.; Mahomed, O. VPAgs-Dataset4ML: A Dataset to Predict Viral Protective Antigens for Machine Learning-Based Reverse Vaccinology. Data 2023, 8, 41. [https://doi.org/10.3390/data8020041](https://doi.org/10.3390/data8020041).
## Uses
### Direct Use
This dataset serves as an invaluable asset for developing and testing machine learning algorithms aimed at identifying potential vaccine candidates. Its application extends beyond academic research, offering insights that could significantly impact vaccine development strategies, particularly in the realm of emerging viral threats.
## Dataset Structure
### Data Instances
```
{
"sequence": "MATLLRSLALFKRNKDKPPITSGSGGAIRGIKHIIIVPIPGDSSITTRSRLLDRLVRLIGNPDVSGPKLTGALIGILSLFVESPGQLIQRITDDPDVSIRLLEVVQSDQSQSGLTFASRGTNMEDEADQYFSHDDPSSSDQSRSGWFENKEISDIEVQDPEGFNMILGTILAQIWVLLAKAVTAPDTAADSELRRWIKYTQQRRVVGEFRLERKWLDVVRNRIAEDLSLRRFMVALILDIKRTPGNKPRIAEMICDIDTYIVEAGLASFILTIKFGIETMYPALGLHEFAGELSTLESLMNLYQQMGETAPYMVILENSIQNKFSAGSYPLLWSYAMGVGVELENSMGGLNFGRSYFDPAYFRLGQEMVRRSAGKVSSTLASELGITAEDARLVSEIAMHTTEDRISRAVGPRQAQVSFLHGDQSENELPGLGGKEDRRVKQGRGEARESYRETGSSRASDARAAHPPTSMPLDIDTASESGQDPQDSRRSADALLRLQAMAGILEEQGSDTDTPRVYNDRDLLD",
"label": "1"
}
```
### Data Fields
- `sequence`: A string representing the amino acid sequence of a viral protein.
- `label`: An integer indicating whether the sequence is a protective antigen (1) or not (0).
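A minimal schema check sketched from the field descriptions above. It assumes the 20 canonical amino-acid letters (real entries may also use extended codes such as `X`), and since the instance above stores the label as the string `"1"` while the field description says integer, both forms are accepted:

```python
# The 20 canonical amino-acid one-letter codes (an assumption for this
# sketch; extended codes like X, B, Z, U would need to be added).
AMINO_ACIDS = set("ACDEFGHIKLMNPQRSTVWY")

def is_valid(record):
    """Check one record against the fields described above."""
    return (
        isinstance(record["sequence"], str)
        and set(record["sequence"]) <= AMINO_ACIDS
        # Accept both the integer form (Data Fields) and the
        # string form (Data Instances).
        and str(record["label"]) in ("0", "1")
    )

print(is_valid({"sequence": "MATLLRS", "label": 1}))  # True
```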
### Data Splits
The dataset has not been pre-split into training and testing sets, to allow for flexibility:
you may split it into training and testing sets using your preferred ratio.
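Since no official split ships with the dataset, here is a minimal, reproducible hold-out split sketch. It is shown on hypothetical toy records with the same fields; when loading the real dataset with the `datasets` library, you could instead call its `Dataset.train_test_split` method:

```python
import random

# Toy records mimicking the dataset's fields (hypothetical sequences,
# not taken from VPAgs-Dataset4ML itself).
records = [{"sequence": f"SEQ{i}", "label": i % 2} for i in range(10)]

def holdout_split(data, test_ratio=0.2, seed=42):
    """Shuffle reproducibly, then split into train/test lists."""
    shuffled = data[:]
    random.Random(seed).shuffle(shuffled)
    n_test = int(len(shuffled) * test_ratio)
    return shuffled[n_test:], shuffled[:n_test]

train, test = holdout_split(records)
print(len(train), len(test))  # 8 2
```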
## Dataset Creation
### Curation Rationale
The dataset was curated to address the need for a machine learning-ready dataset containing labeled protective (positive) and non-protective (negative) viral protein sequences. This dataset facilitates the development of machine learning models for predicting viral protective antigens, which are crucial for reverse vaccinology and the development of effective vaccines against viral pathogens.
### Source Data
#### Data Collection and Processing
The dataset was compiled through a meticulous process involving the retrieval of viral PAgs with experimental evidence from the [Protegen](https://violinet.org/protegen/) database, followed by computational steps carried out on viral protein sequences in [UniProt](https://www.uniprot.org/) to select non-protective protein sequences.
## Bias, Risks, and Limitations
Given the imbalanced nature of the dataset, with a greater number of non-protective than protective sequences, there's a risk that machine learning models may become biased towards predicting the majority class. To mitigate this, researchers are encouraged to implement strategies such as balanced sampling or weighted loss functions during model training. Additionally, the dataset's focus on viral proteins from specific databases might limit its coverage of all potential protective antigens across the viral kingdom, which should be considered when generalizing findings.
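As one illustration of the weighted-loss mitigation mentioned above, a minimal sketch computing inverse-frequency class weights from the class counts reported in this card (210 protective, 1,935 non-protective); how the weights are passed to a model depends on your training framework:

```python
# label -> number of sequences, taken from the counts in this card.
counts = {1: 210, 0: 1935}
total = sum(counts.values())  # 2,145 sequences

# Inverse-frequency weighting: weight_c = total / (n_classes * count_c),
# so the minority (protective) class contributes more to the loss.
weights = {label: total / (len(counts) * n) for label, n in counts.items()}
print(weights)  # the minority class (label 1) gets the larger weight
```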
## Citation
**BibTeX:**
```bibtex
@article{salod2023vpags,
title={VPAgs-Dataset4ML: A Dataset to Predict Viral Protective Antigens for Machine Learning-Based Reverse Vaccinology},
author={Salod, Zakia and Mahomed, Ozayr},
journal={Data},
volume={8},
number={41},
year={2023},
publisher={MDPI},
doi={10.3390/data8020041}
}
```
**APA:**
Salod, Z., & Mahomed, O. (2023). VPAgs-Dataset4ML: A Dataset to Predict Viral Protective Antigens for Machine Learning-Based Reverse Vaccinology. Data, 8(41). https://doi.org/10.3390/data8020041
## More Information
This dataset is a crucial step towards leveraging machine learning in the field of vaccinology. By providing a high-quality, curated dataset, VPAgs-Dataset4ML facilitates the development of predictive models that can identify promising vaccine candidates, potentially accelerating vaccine development and deployment in response to emerging viral threats.
## Dataset Card Authors
Zakia Salod, Ozayr Mahomed
## Dataset Card Contact
For any inquiries regarding this dataset, please contact Zakia Salod at [zakia.salod@gmail.com](mailto:zakia.salod@gmail.com).
|
FanChen0116/few7_19100_chat_time4x | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 57133
num_examples: 314
- name: validation
num_bytes: 998
num_examples: 6
- name: test
num_bytes: 646729
num_examples: 3731
download_size: 0
dataset_size: 704860
---
# Dataset Card for "few7_19100_chat_time4x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/m38_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of m38/M38/伯莱塔38型 (Girls' Frontline)
This is the dataset of m38/M38/伯莱塔38型 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `blue_eyes, long_hair, ahoge, hat, bangs, beret, hair_ornament, brown_hair, hairclip, breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 13.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 13.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 10.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 20.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m38_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, solo, white_shirt, long_sleeves, pleated_skirt, submachine_gun, white_background, white_thighhighs, black_footwear, black_skirt, closed_mouth, holding_gun, jacket, military_uniform, red_necktie, loafers, belt, blush, collared_shirt, full_body, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | simple_background | solo | white_shirt | long_sleeves | pleated_skirt | submachine_gun | white_background | white_thighhighs | black_footwear | black_skirt | closed_mouth | holding_gun | jacket | military_uniform | red_necktie | loafers | belt | blush | collared_shirt | full_body | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:-------|:--------------|:---------------|:----------------|:-----------------|:-------------------|:-------------------|:-----------------|:--------------|:---------------|:--------------|:---------|:-------------------|:--------------|:----------|:-------|:--------|:-----------------|:------------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
omerist/arabicReviews-ds-mini | ---
dataset_info:
features:
- name: source
dtype: string
- name: title
dtype: string
- name: content
dtype: string
- name: content_length
dtype: int64
splits:
- name: train
num_bytes: 11505614.4
num_examples: 3600
- name: validation
num_bytes: 1278401.6
num_examples: 400
download_size: 6325726
dataset_size: 12784016.0
---
# Dataset Card for "arabicReviews-ds-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sanka85/rstp-llama-2 | ---
license: apache-2.0
---
|
AdapterOcean/data-standardized_cluster_20 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 87352890
num_examples: 8382
download_size: 25168511
dataset_size: 87352890
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aples/FineTune_Dataset_Aples_1K | ---
license: mit
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.1_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43740541
num_examples: 18928
- name: epoch_1
num_bytes: 44339568
num_examples: 18928
- name: epoch_2
num_bytes: 44401936
num_examples: 18928
- name: epoch_3
num_bytes: 44435958
num_examples: 18928
- name: epoch_4
num_bytes: 44449446
num_examples: 18928
- name: epoch_5
num_bytes: 44430245
num_examples: 18928
- name: epoch_6
num_bytes: 44416729
num_examples: 18928
- name: epoch_7
num_bytes: 44400833
num_examples: 18928
- name: epoch_8
num_bytes: 44391591
num_examples: 18928
- name: epoch_9
num_bytes: 44385508
num_examples: 18928
- name: epoch_10
num_bytes: 44384271
num_examples: 18928
- name: epoch_11
num_bytes: 44383257
num_examples: 18928
- name: epoch_12
num_bytes: 44381820
num_examples: 18928
- name: epoch_13
num_bytes: 44382190
num_examples: 18928
- name: epoch_14
num_bytes: 44381849
num_examples: 18928
- name: epoch_15
num_bytes: 44380290
num_examples: 18928
- name: epoch_16
num_bytes: 44381472
num_examples: 18928
- name: epoch_17
num_bytes: 44379735
num_examples: 18928
- name: epoch_18
num_bytes: 44380215
num_examples: 18928
- name: epoch_19
num_bytes: 44379890
num_examples: 18928
- name: epoch_20
num_bytes: 44379927
num_examples: 18928
- name: epoch_21
num_bytes: 44379079
num_examples: 18928
- name: epoch_22
num_bytes: 44379752
num_examples: 18928
- name: epoch_23
num_bytes: 44378240
num_examples: 18928
- name: epoch_24
num_bytes: 44380122
num_examples: 18928
- name: epoch_25
num_bytes: 44379598
num_examples: 18928
- name: epoch_26
num_bytes: 44379422
num_examples: 18928
- name: epoch_27
num_bytes: 44379210
num_examples: 18928
- name: epoch_28
num_bytes: 44379751
num_examples: 18928
- name: epoch_29
num_bytes: 44378764
num_examples: 18928
download_size: 700599864
dataset_size: 1331001209
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
Azure99/blossom-wizard-v2 | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- zh
- en
size_categories:
- 100K<n<1M
---
# BLOSSOM WIZARD V2
### Introduction
Blossom Wizard V2 is a bilingual Chinese-English instruction dataset derived from WizardLM_evol_instruct_V2, suitable for instruction fine-tuning.
Compared with blossom-wizard-v1, the instructions are unchanged, while the quality of the outputs has been further improved.
This dataset extracts instructions from WizardLM_evol_instruct_V2. The instructions were first translated into Chinese and the translations verified, then each instruction was sent to the gpt-3.5-turbo-0613 model to generate a response; responses containing self-identity statements or refusals were filtered out to facilitate later alignment. In addition, to ensure consistent response style and a balanced Chinese-English ratio, the same calls were made with the untranslated original instructions, yielding a 1:1 bilingual Chinese-English instruction dataset.
Compared with Chinese datasets produced by directly translating the original Wizard data, Blossom Wizard offers higher consistency and quality.
This release contains 30% of the full data: 50K records each in Chinese and English, 100K in total.
### Languages
Mainly Chinese and English.
### Dataset Structure
The dataset contains two files, blossom-wizard-v1-chinese-50k.json and blossom-wizard-v1-english-50k.json, holding the Chinese and English data respectively.
Each record represents a complete conversation and contains two fields: id and conversations.
- id: a string, the instruction id from the original WizardLM_evol_instruct_V2.
- conversations: an array of objects, each with role and content fields; role is either user or assistant, representing user input and assistant output respectively, and content is the corresponding text.
### Dataset Limitations
All responses in this dataset were generated by gpt-3.5-turbo-0613 and have not undergone rigorous validation; they may contain inaccurate or even seriously wrong answers. Moreover, since refusal responses were filtered out, a model trained only on this dataset may not refuse illegal requests. |
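The record layout above can be sketched in a few lines of Python. This is a minimal illustration, not code from the dataset: the sample record content is invented, and the `to_prompt` helper is a hypothetical preprocessing step one might use before fine-tuning.

```python
import json

# A single record in the shape the card describes (the content here is
# illustrative, not taken from the dataset).
sample = json.loads("""
{"id": "wizard-00001",
 "conversations": [
   {"role": "user", "content": "What is instruction tuning?"},
   {"role": "assistant", "content": "Fine-tuning a model on instruction-response pairs."}
 ]}
""")

def to_prompt(record):
    # Flatten the conversation into role-tagged lines, one per turn.
    return "\n".join(f'{t["role"]}: {t["content"]}' for t in record["conversations"])

print(to_prompt(sample))
```

The same iteration applies to every record in either JSON file, since both splits share this schema.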
AlekseyKorshuk/airoboros-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 187514793
num_examples: 118142
download_size: 104063685
dataset_size: 187514793
---
# Dataset Card for "airoboros-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_sent_after_sent_train_400_eval_40_random_permute_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 7716633.701377225
num_examples: 5514
- name: validation
num_bytes: 232483
num_examples: 200
download_size: 1305162
dataset_size: 7949116.701377225
---
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
derchr/py | ---
license: bigscience-openrail-m
---
<h1>This dataset is used to train AI to use Python.</h1> |
ibranze/araproje_hellaswag_en_s3 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 82715
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_s3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties | ---
pretty_name: Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T10:23:58.856045](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties/blob/main/results_2023-12-10T10-23-58.856045.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7470453642761706,\n\
\ \"acc_stderr\": 0.028619765288934736,\n \"acc_norm\": 0.7535136424409922,\n\
\ \"acc_norm_stderr\": 0.02914053190348252,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5283809284788162,\n\
\ \"mc2_stderr\": 0.01556812706457422\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6541525592511452,\n\
\ \"acc_stderr\": 0.0047467168057357635,\n \"acc_norm\": 0.8499302927703645,\n\
\ \"acc_norm_stderr\": 0.003564098420387773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424056,\n\
\ \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424056\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6587301587301587,\n \"acc_stderr\": 0.02441923496681907,\n \"\
acc_norm\": 0.6587301587301587,\n \"acc_norm_stderr\": 0.02441923496681907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8903225806451613,\n \"acc_stderr\": 0.017776778700485177,\n \"\
acc_norm\": 0.8903225806451613,\n \"acc_norm_stderr\": 0.017776778700485177\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527048,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527048\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7846153846153846,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.7846153846153846,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \
\ \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \
\ \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.040622900186837764,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.040622900186837764\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769567,\n \"\
acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769567\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"\
acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744632,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744632\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.02783991527833965,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.02783991527833965\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.01872430174194165,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.01872430174194165\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n\
\ \"acc_stderr\": 0.01064835630187633,\n \"acc_norm\": 0.9016602809706258,\n\
\ \"acc_norm_stderr\": 0.01064835630187633\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n\
\ \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7284916201117319,\n\
\ \"acc_stderr\": 0.014874252168095268,\n \"acc_norm\": 0.7284916201117319,\n\
\ \"acc_norm_stderr\": 0.014874252168095268\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.021828596053108402,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.021828596053108402\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n\
\ \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.8070739549839229,\n\
\ \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.01993508609214988,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.01993508609214988\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6382978723404256,\n \"acc_stderr\": 0.028663820147199485,\n \
\ \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.028663820147199485\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5893089960886571,\n\
\ \"acc_stderr\": 0.012564871542534356,\n \"acc_norm\": 0.5893089960886571,\n\
\ \"acc_norm_stderr\": 0.012564871542534356\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.022966067585581795,\n\
\ \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.022966067585581795\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \
\ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.7636363636363637,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n\
\ \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659386,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659386\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5283809284788162,\n\
\ \"mc2_stderr\": 0.01556812706457422\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386776\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5405610310841547,\n \
\ \"acc_stderr\": 0.013727093010429785\n }\n}\n```"
repo_url: https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|arc:challenge|25_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|gsm8k|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hellaswag|10_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T10-23-58.856045.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- '**/details_harness|winogrande|5_2023-12-10T10-23-58.856045.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T10-23-58.856045.parquet'
- config_name: results
data_files:
- split: 2023_12_10T10_23_58.856045
path:
- results_2023-12-10T10-23-58.856045.parquet
- split: latest
path:
- results_2023-12-10T10-23-58.856045.parquet
---
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties",
"harness_winogrande_5",
	split="latest")
```
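Once loaded (or read straight from the results JSON linked in the "Latest results" section), the per-task metrics are plain nested dicts and can be post-processed directly. A minimal sketch, using a few values copied from this card's results (the subset of tasks shown here is illustrative):

```python
# Results dict shaped like the "Latest results" JSON on this card:
# the "all" key holds aggregates, every other key is one evaluated task.
results = {
    "all": {"acc": 0.7470453642761706, "acc_norm": 0.7535136424409922},
    "harness|arc:challenge|25": {"acc": 0.6245733788395904},
    "harness|hellaswag|10": {"acc": 0.6541525592511452},
}

# Collect per-task accuracies, skipping the aggregate entry.
task_accs = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# Find the task with the highest accuracy.
best_task = max(task_accs, key=task_accs.get)
print(best_task)  # harness|hellaswag|10
```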
## Latest results
These are the [latest results from run 2023-12-10T10:23:58.856045](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties/blob/main/results_2023-12-10T10-23-58.856045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7470453642761706,
"acc_stderr": 0.028619765288934736,
"acc_norm": 0.7535136424409922,
"acc_norm_stderr": 0.02914053190348252,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5283809284788162,
"mc2_stderr": 0.01556812706457422
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111726,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.6541525592511452,
"acc_stderr": 0.0047467168057357635,
"acc_norm": 0.8499302927703645,
"acc_norm_stderr": 0.003564098420387773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.61,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424056,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6587301587301587,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.6587301587301587,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8903225806451613,
"acc_stderr": 0.017776778700485177,
"acc_norm": 0.8903225806451613,
"acc_norm_stderr": 0.017776778700485177
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527048,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527048
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7846153846153846,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.7846153846153846,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.022159373072744442,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.022159373072744442
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.040622900186837764,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.040622900186837764
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769567,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769567
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744632,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744632
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.02783991527833965,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.02783991527833965
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194165,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194165
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.01064835630187633,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.01064835630187633
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7284916201117319,
"acc_stderr": 0.014874252168095268,
"acc_norm": 0.7284916201117319,
"acc_norm_stderr": 0.014874252168095268
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.021828596053108402,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.021828596053108402
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8070739549839229,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.8070739549839229,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.01993508609214988,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.01993508609214988
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.028663820147199485,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.028663820147199485
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5893089960886571,
"acc_stderr": 0.012564871542534356,
"acc_norm": 0.5893089960886571,
"acc_norm_stderr": 0.012564871542534356
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8272058823529411,
"acc_stderr": 0.022966067585581795,
"acc_norm": 0.8272058823529411,
"acc_norm_stderr": 0.022966067585581795
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659386,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659386
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5283809284788162,
"mc2_stderr": 0.01556812706457422
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386776
},
"harness|gsm8k|5": {
"acc": 0.5405610310841547,
"acc_stderr": 0.013727093010429785
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-staging-eval-cnn_dailymail-3.0.0-5863f2-15966190 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: SamuelAllen123/t5-efficient-large-nl36_fine_tuned_for_sum
metrics: ['rouge', 'accuracy']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: SamuelAllen123/t5-efficient-large-nl36_fine_tuned_for_sum
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
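For context, the ROUGE metric listed above measures n-gram overlap between the generated summaries and the reference highlights. A rough unigram-recall sketch (an illustration only, not the official scorer, which adds stemming and several ROUGE variants):

```python
def rouge1_recall(prediction, reference):
    """Fraction of the reference's unique unigrams that also appear in the
    prediction -- a rough sketch of ROUGE-1 recall, not the official scorer."""
    pred_tokens = prediction.lower().split()
    ref_tokens = set(reference.lower().split())
    if not ref_tokens:
        return 0.0
    overlap = sum(1 for tok in ref_tokens if tok in pred_tokens)
    return overlap / len(ref_tokens)

score = rouge1_recall(
    "police arrested the suspect on friday",
    "the suspect was arrested friday",
)
```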
## Contributions
Thanks to [@samuelallen123](https://huggingface.co/samuelallen123) for evaluating this model. |
huggingnft/cryptoskulls | ---
tags:
- huggingnft
- nft
- huggan
- gan
- image
- images
task:
- unconditional-image-generation
datasets:
- huggingnft/cryptoskulls
license: mit
---
# Dataset Card
## Disclaimer
All rights belong to their owners.
Models and datasets can be removed from the site at the request of the copyright holder.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
NFT images dataset for unconditional generation.
NFT collection available [here](https://opensea.io/collection/cryptoskulls).
Model is available [here](https://huggingface.co/huggingnft/cryptoskulls).
Check Space: [link](https://huggingface.co/spaces/AlekseyKorshuk/huggingnft).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingnft/cryptoskulls")
```
## Dataset Structure
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
- `image`: an `image` feature.
- `id`: an `int` feature.
- `token_metadata`: a `str` feature.
- `image_original_url`: a `str` feature.
### Data Splits
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingnft,
    author={Aleksey Korshuk},
    year={2022}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingnft)
|
caldervf/cicero_dataset_with_summaries | ---
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: content
dtype: string
- name: summary
dtype: string
- name: content_filtered
dtype: string
splits:
- name: train
num_bytes: 15763532
num_examples: 1143
download_size: 0
dataset_size: 15763532
---
# Dataset Card for "cicero_dataset_with_summaries"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
varunr14/text_2_prompt | ---
license: unknown
---
|
bdsaglam/musique-jerx-sft-mt-ss-openai | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 97827
num_examples: 58
download_size: 32030
dataset_size: 97827
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
peisuke/hh-rlhf | ---
license: mit
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 205102384
num_examples: 160800
- name: test
num_bytes: 10978491
num_examples: 8552
download_size: 127661326
dataset_size: 216080875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
August4293/Preference-Dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
splits:
- name: train
num_bytes: 8024886
num_examples: 4449
download_size: 4200748
dataset_size: 8024886
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
- question-answering
language:
- en
pretty_name: Mistral 7b Preference Dataset
size_categories:
- 1K<n<10K
---
# Mistral Self-Alignment Preference Dataset
**Warning: This dataset contains harmful and offensive data! Proceed with caution.**
The Mistral Self-Alignment Preference Dataset was generated by Mistral 7b using Anthropic's red-teaming prompts dataset, available at [Hugging Face - Anthropic's Red Teaming Prompts Dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf). The data was generated with the Preference Data Generation Notebook, which can be found [here](https://github.com/August-murr/Lab/blob/main/Mistral%20Self%20Alignment/Notebooks/preference-dataset-generation.ipynb).
The purpose of this dataset is to facilitate self-alignment, as explained in detail on the corresponding [GitHub page](https://github.com/August-murr/Lab/tree/main/Mistral%20Self%20Alignment).
## Dataset Details:
- **Source:** [Anthropic's Red Teaming Prompts Dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf)
- **Generated by:** [Mistral 7b](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)
- **Purpose:** Self-Alignment
- **Usage:** Alignment and Evaluation
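The `prompt`/`chosen`/`rejected` fields map directly onto the pairs a preference-optimization (e.g. DPO-style) trainer consumes. A minimal sketch with an illustrative record (not drawn from the actual data):

```python
# Illustrative record; real rows come from
# load_dataset("August4293/Preference-Dataset", split="train").
record = {
    "prompt": "How do I stay safe online?",
    "chosen": "Use strong, unique passwords and enable two-factor authentication.",
    "rejected": "Just click on everything; it is probably fine.",
}

def to_preference_pair(rec):
    """Build the (prompt+chosen, prompt+rejected) text pair that
    preference-optimization trainers typically expect."""
    return (
        rec["prompt"] + "\n" + rec["chosen"],
        rec["prompt"] + "\n" + rec["rejected"],
    )

good, bad = to_preference_pair(record)
```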
|
CyberHarem/qiqi_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of qiqi/七七/七七 (Genshin Impact)
This is the dataset of qiqi/七七/七七 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `purple_hair, purple_eyes, hair_ornament, hat, long_hair, coin_hair_ornament, braid, purple_headwear, hair_between_eyes, single_braid, braided_ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 943.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qiqi_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 791.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qiqi_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1334 | 1.62 GiB | [Download](https://huggingface.co/datasets/CyberHarem/qiqi_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/qiqi_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bead_necklace, jiangshi, long_sleeves, looking_at_viewer, ofuda, open_mouth, qingdai_guanmao, solo, dress, simple_background, white_background, wide_sleeves, :o, chinese_clothes, nail_polish, black_nails, earrings, upper_body, vision_(genshin_impact), white_thighhighs |
| 1 | 9 |  |  |  |  |  | 1girl, bead_necklace, dress, jiangshi, long_sleeves, looking_at_viewer, ofuda, qingdai_guanmao, solo, white_background, white_thighhighs, simple_background, wide_sleeves, vision_(genshin_impact), parted_lips, bandaged_leg, shorts |
| 2 | 9 |  |  |  |  |  | 1girl, bead_necklace, jiangshi, long_sleeves, ofuda, qingdai_guanmao, solo, white_thighhighs, wide_sleeves, chinese_clothes, looking_at_viewer, sidelocks, dress, bandaged_leg, vision_(genshin_impact), parted_lips, yin_yang_orb |
| 3 | 5 |  |  |  |  |  | 1girl, bead_necklace, chinese_clothes, jiangshi, long_sleeves, looking_at_viewer, ofuda, qingdai_guanmao, sidelocks, solo, wide_sleeves, low_ponytail, simple_background, vision_(genshin_impact), white_background, cape, dress, orb, parted_lips, yin_yang |
| 4 | 7 |  |  |  |  |  | 1girl, bead_necklace, holding_sword, jiangshi, long_sleeves, ofuda, qingdai_guanmao, solo, wide_sleeves, chinese_clothes, looking_at_viewer, cape, orb, sidelocks, snowflakes, white_thighhighs, yin_yang, shorts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bead_necklace | jiangshi | long_sleeves | looking_at_viewer | ofuda | open_mouth | qingdai_guanmao | solo | dress | simple_background | white_background | wide_sleeves | :o | chinese_clothes | nail_polish | black_nails | earrings | upper_body | vision_(genshin_impact) | white_thighhighs | parted_lips | bandaged_leg | shorts | sidelocks | yin_yang_orb | low_ponytail | cape | orb | yin_yang | holding_sword | snowflakes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-----------|:---------------|:--------------------|:--------|:-------------|:------------------|:-------|:--------|:--------------------|:-------------------|:---------------|:-----|:------------------|:--------------|:--------------|:-----------|:-------------|:--------------------------|:-------------------|:--------------|:---------------|:---------|:------------|:---------------|:---------------|:-------|:------|:-----------|:----------------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | X | X | X | X | X | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | | X | | X | | | | | X | X | X | X | | X | X | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | X | | X | | | | | X | | X | | | X | | X | X | X | X | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | | | X | | X | | | | | | X | | | X | X | | | X | X | X | X | X |
|
ahmadSiddiqi/amazon_reviews_fr | ---
dataset_info:
features:
- name: text
dtype: string
- name: label_text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 25767910.0
num_examples: 98000
- name: test
num_bytes: 11043390.0
num_examples: 42000
download_size: 17155331
dataset_size: 36811300.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
weqweasdas/zephyr_ultra_feedback_model1 | ---
configs:
- config_name: default
data_files:
- split: ds1
path: data/ds1-*
dataset_info:
features:
- name: prompt
dtype: string
- name: rewards
sequence: float64
- name: kl
sequence: float64
- name: responses
sequence: string
splits:
- name: ds1
num_bytes: 117318848
num_examples: 7500
download_size: 55790060
dataset_size: 117318848
---
# Dataset Card for "zephyr_ultra_feedback_model1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adityarra07/train_5000 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 666393486.3189659
num_examples: 5000
- name: test
num_bytes: 26655739.452758636
num_examples: 200
download_size: 683677183
dataset_size: 693049225.7717246
---
# Dataset Card for "train_5000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ttxy/resume_ner | ---
language:
- code
pretty_name: "resume ner dataseet"
tags:
- ner
license: "bsd"
task_categories:
- token-classification
---
Chinese resume NER dataset; source: https://github.com/luopeixiang/named_entity_recognition .
The data format is shown below: each line consists of a single character and its corresponding label, the tag set uses the BIOES scheme, and sentences are separated by a blank line.
```text
美 B-LOC
国 E-LOC
的 O
华 B-PER
莱 I-PER
士 E-PER
我 O
跟 O
他 O
谈 O
笑 O
风 O
生 O
```
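A short, self-contained sketch of reading this format into per-sentence (character, tag) pairs (the sample string below is illustrative):

```python
def read_bioes(text):
    """Parse character-per-line BIOES data; blank lines separate sentences."""
    sentences, current = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line:  # a blank line ends the current sentence
            if current:
                sentences.append(current)
                current = []
            continue
        char, tag = line.split()
        current.append((char, tag))
    if current:  # flush the last sentence if the text lacks a trailing blank line
        sentences.append(current)
    return sentences

sample = "美 B-LOC\n国 E-LOC\n的 O\n\n我 O\n跟 O"
sentences = read_bioes(sample)
```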
# Results
## Comparison of results across models:
<img src="https://file.ddot.cc/imagehost/2023/8bb93212-5812-4211-91b8-7a6bda841e1b.png">
## BERT-tiny results
|model | precision | recall | f1-score | support |
|---|---|---|---|---|
|BERT-tiny | 0.9490 | 0.9538 | 0.9447 | all |
|BERT-tiny | 0.9278 | 0.9251 | 0.9313 | 100 training samples |
Notes:
- In later testing, BERT-tiny (softmax) with 100 training samples did not reproduce the 0.9313 result; the best result was 0.8612
- BERT-tiny + LSTM (softmax) with 100 samples reaches a `val_f1` of 0.8737
|
vwxyzjn/summarize_from_feedback_oai_preprocessing_1706381144 | ---
dataset_info:
features:
- name: info
struct:
- name: id
dtype: string
- name: post
dtype: string
- name: title
dtype: string
- name: subreddit
dtype: string
- name: site
dtype: string
- name: article
dtype: string
- name: summaries
list:
- name: text
dtype: string
- name: policy
dtype: string
- name: note
dtype: string
- name: choice
dtype: int32
- name: worker
dtype: string
- name: batch
dtype: string
- name: split
dtype: string
- name: extra
struct:
- name: confidence
dtype: int32
- name: query_token
sequence: int64
- name: query
dtype: string
- name: chosen
dtype: string
- name: chosen_token
sequence: int64
- name: chosen_token_len
dtype: int64
- name: rejected
dtype: string
- name: rejected_token
sequence: int64
- name: rejected_token_len
dtype: int64
- name: chosen_policy
dtype: string
- name: rejected_policy
dtype: string
- name: policies
dtype: string
- name: query_chosen
dtype: string
- name: query_chosen_token
sequence: int64
- name: query_chosen_token_len
dtype: int64
- name: query_rejected
dtype: string
- name: query_rejected_token
sequence: int64
- name: query_rejected_token_len
dtype: int64
- name: query_token_len
dtype: int64
- name: query_chosen_token_response_label
sequence: int64
- name: query_rejected_token_response_label
sequence: int64
splits:
- name: train
num_bytes: 3159944659
num_examples: 92858
- name: validation
num_bytes: 2859307359
num_examples: 83802
- name: validation_cnndm
num_bytes: 225356751
num_examples: 2284
download_size: 290785403
dataset_size: 6244608769
---
# Dataset Card for "summarize_from_feedback_oai_preprocessing_1706381144"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/drop | ---
license: cc-by-4.0
--- |
Yinxing/LLM_Dataset | ---
license: mit
---
|
thatbrowngirl/tamilReview-ds-mini | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: review
sequence: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 973458.45725
num_examples: 3473
- name: validation
num_bytes: 108193.1945
num_examples: 386
download_size: 0
dataset_size: 1081651.65175
---
# Dataset Card for "tamilReview-ds-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
claudehotline/my_dataset | ---
dataset_info:
features:
- name: data
dtype: float64
splits:
- name: train
num_bytes: 80000
num_examples: 10000
download_size: 96279
dataset_size: 80000
---
# Dataset Card for "my_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
worldboss/bank-of-ghana-rates | ---
language:
- en
license: apache-2.0
size_categories:
- 100K<n<1M
task_categories:
- conversational
- text-generation
- summarization
- question-answering
- text-classification
- text-retrieval
- translation
pretty_name: Bank of Ghana Rates
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Date
dtype: string
- name: Currency
dtype: string
- name: Currency_Pair
dtype: string
- name: Buying
dtype: string
- name: Selling
dtype: string
- name: Mid_Rate
dtype: string
splits:
- name: train
num_bytes: 8628801
num_examples: 132525
download_size: 2273117
dataset_size: 8628801
tags:
- ghana
- news
- ghana-news
- bank-of-ghana
- exchange-rates
- ghana data
---
### Description 🙅♂️🤖
Bank of Ghana historical and real-time exchange rates data. [Bank of Ghana](https://www.bog.gov.gh/treasury-and-the-markets/historical-interbank-fx-rates/)
Click Here:[](https://colab.research.google.com/drive/1zZUIyp9zBhwL5CqHS3Ggf5vJCr_yTYw0?usp=sharing)
### Data Format
```json
{
  "Date": "...",
  "Currency": "...",
  "Currency_Pair": "...",
  "Buying": "...",
  "Selling": "...",
  "Mid_Rate": "..."
}
```
### Load Dataset
```shell
pip install datasets
```
```python
import pandas as pd
from datasets import load_dataset

rates = load_dataset("worldboss/bank-of-ghana-rates", split="train")
pd.DataFrame(rates).head()
```
### Author
The data was constructed by Theophilus Siameh (theodondre@gmail.com). |
CyberHarem/prinz_rupprecht_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of prinz_rupprecht/プリンツ・ループレヒト/鲁普雷希特亲王 (Azur Lane)
This is the dataset of prinz_rupprecht/プリンツ・ループレヒト/鲁普雷希特亲王 (Azur Lane), containing 98 images and their tags.
The core tags of this character are `long_hair, pink_hair, breasts, purple_eyes, horns, hair_on_horn, hair_over_one_eye, medium_breasts, very_long_hair, hairband, mechanical_horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 98 | 185.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_rupprecht_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 98 | 86.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_rupprecht_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 260 | 198.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_rupprecht_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 98 | 154.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_rupprecht_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 260 | 305.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_rupprecht_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/prinz_rupprecht_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, iron_cross, looking_at_viewer, simple_background, solo, white_background, belt, black_dress, smile, long_sleeves, off_shoulder, antenna_hair, bangs, blush, open_mouth, sleeves_past_fingers |
| 1 | 19 |  |  |  |  |  | 1girl, iron_cross, solo, white_thighhighs, sleeves_past_fingers, bare_shoulders, detached_collar, looking_at_viewer, cleavage, black_dress, black_footwear, long_sleeves, belt, full_body, simple_background, bangs, black_horns, high_heels, off_shoulder, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | cleavage | iron_cross | looking_at_viewer | simple_background | solo | white_background | belt | black_dress | smile | long_sleeves | off_shoulder | antenna_hair | bangs | blush | open_mouth | sleeves_past_fingers | white_thighhighs | detached_collar | black_footwear | full_body | black_horns | high_heels |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:-------------|:--------------------|:--------------------|:-------|:-------------------|:-------|:--------------|:--------|:---------------|:---------------|:---------------|:--------|:--------|:-------------|:-----------------------|:-------------------|:------------------|:-----------------|:------------|:--------------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | X |
|
ittailup/dictamenes_consejoestado_es | ---
dataset_info:
features:
- name: items
struct:
- name: text
dtype: string
splits:
- name: train
num_bytes: 866824944
num_examples: 101
download_size: 376824021
dataset_size: 866824944
---
# Dataset Card for "dictamenes_consejoestado_es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vigneshgs7/Boundary_detection_Doc_7 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 15325018761.0
num_examples: 308
download_size: 1012812048
dataset_size: 15325018761.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jamestalentium/dialogsum_1000_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: topic
dtype: string
splits:
- name: test
num_bytes: 1353776.49
num_examples: 1485
download_size: 328916
dataset_size: 1353776.49
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "dialogsum_1000_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kayvane/dreambooth-hackathon-rick-and-morty-images-2 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3482571.0
num_examples: 24
download_size: 3481016
dataset_size: 3482571.0
---
# Dataset Card for "dreambooth-hackathon-rick-and-morty-images-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
giux78/ultrafeedback-binarized-preferences-cleaned-ita | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_prefs
num_bytes: 374051096
num_examples: 54810
- name: test_prefs
num_bytes: 41685745
num_examples: 6090
download_size: 201968877
dataset_size: 415736841
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
SHS/New_BioRED_Model | ---
dataset_info:
features:
- name: pmid
dtype: string
- name: passage
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 752283
num_examples: 148
- name: val
num_bytes: 171371
num_examples: 33
- name: test
num_bytes: 160097
num_examples: 30
download_size: 392630
dataset_size: 1083751
---
# Dataset Card for "New_BioRED_Model"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KaifengGGG/WenYanWen_English_Parrallel | ---
license: mit
dataset_info:
features:
- name: info
dtype: string
- name: modern
dtype: string
- name: classical
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 330211888.41295385
num_examples: 875220
- name: test
num_bytes: 36690335.58704614
num_examples: 97247
download_size: 259697443
dataset_size: 366902224.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
**Dataset Card for WenYanWen\_English\_Parallel**
**Dataset Summary**
The WenYanWen\_English\_Parallel dataset is a multilingual parallel corpus in Classical Chinese (Wenyanwen), modern Chinese, and English. The Classical Chinese and modern Chinese parts are sourced from the NiuTrans/Classical-Modern dataset, while the corresponding English translations are generated using Gemini Pro.
**Supported Tasks and Leaderboard**
This dataset can be used for various multilingual and translation tasks, including but not limited to:
1. Neural Machine Translation (Classical Chinese to Modern Chinese)
2. Neural Machine Translation (Modern Chinese to English)
3. Neural Machine Translation (Classical Chinese to English)
4. Multilingual Text-to-Text Transfer
There is currently no official leaderboard for this dataset.
**License**
Please refer to the license of the [NiuTrans/Classical-Modern](https://github.com/NiuTrans/Classical-Modern) dataset and the terms of use of Gemini Pro for more information regarding the dataset license.
**Citation Information**
If you use this dataset in your research, please cite the original sources:
1. [NiuTrans/Classical-Modern](https://github.com/NiuTrans/Classical-Modern)
2. Gemini Pro (for English translations)
**Dataset Structure**
The dataset is a tab-separated text file with four columns:
1. **document\_info**: The title or source information of the text
2. **modern\_chinese**: The translation of the original Classical Chinese text into modern Chinese
3. **classical\_chinese**: The original text in Classical Chinese (Wenyanwen)
4. **english**: The English translation of the Classical Chinese text
Here is an example of a dataset entry:
| Document Info | Modern Chinese | Classical Chinese | English Translation |
| --- | --- | --- | --- |
| 《黄帝四经·经法·道法》 | 人一降生便有患害随之,这是因为人的本性中存在着欲望且这种欲望永无止境。 | 生有害,曰欲,曰不知足。 | Man is born to know sorrow, because man's nature is selfish and his desires insatiable. |
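The four-column, tab-separated layout described above can be parsed with Python's standard `csv` module. A minimal sketch, where the sample row and field names are illustrative (abbreviated from the example above):

```python
import csv
import io

# One illustrative row in the four-column, tab-separated layout.
sample = (
    "《黄帝四经·经法·道法》\t"
    "人一降生便有患害随之。\t"
    "生有害,曰欲,曰不知足。\t"
    "Man is born to know sorrow."
)

fields = ["document_info", "modern_chinese", "classical_chinese", "english"]
reader = csv.DictReader(io.StringIO(sample), fieldnames=fields, delimiter="\t")
row = next(reader)
print(row["english"])  # Man is born to know sorrow.
```

Using a tab delimiter keeps commas inside the Chinese text from being treated as field separators.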
**Dataset Size**
The dataset contains approximately 972k entries.
**Data Fields**
- **document\_info**: A string representing the title or source information of the text.
- **modern\_chinese**: A string containing the translation of the original Classical Chinese text into modern Chinese.
- **classical\_chinese**: A string containing the original text in Classical Chinese (Wenyanwen).
- **english**: A string containing the English translation of the Classical Chinese text.
**Data Splits**
There are no official data splits for this dataset. We recommend splitting the dataset into train, validation, and test sets at a ratio of 80:10:10.
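The recommended 80:10:10 split can be sketched in plain Python; the seed and function name below are illustrative, not part of the dataset:

```python
import random

def split_80_10_10(rows, seed=42):
    """Shuffle rows and cut them into train/validation/test at 80:10:10."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)
    n_train = int(len(rows) * 0.8)
    n_val = int(len(rows) * 0.1)
    return rows[:n_train], rows[n_train:n_train + n_val], rows[n_train + n_val:]

train, val, test = split_80_10_10(range(1000))
print(len(train), len(val), len(test))  # 800 100 100
```

Fixing the seed makes the split reproducible across runs.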
**Potential Bias**
Since the English translations are generated using Gemini Pro, there might be inconsistencies or errors in the translations, which may introduce bias into the dataset. Additionally, the choice of Classical Chinese texts and their modern Chinese translations may also introduce bias. Finally, the use of a single translation tool for the English translations may result in limited linguistic diversity.
**Potential Social Impact**
This dataset can be used for various multilingual and translation tasks, which can have a positive impact on facilitating cross-cultural communication and understanding. However, it is important to be aware of the potential biases in the dataset and to use the dataset responsibly. Additionally, as with any dataset, it is important to consider the ethical implications of using this dataset, including issues related to data privacy, consent, and representation. |
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_2.7b_VQAv2_visclues_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_8
num_bytes: 25492223
num_examples: 1000
download_size: 4915735
dataset_size: 25492223
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_2.7b_VQAv2_visclues_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Astonzzh/strategy_pred_v4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: short_dialog
sequence: string
- name: previous_summary
struct:
- name: dialog
dtype: string
- name: summary
dtype: string
- name: strategy
dtype: string
splits:
- name: train
num_bytes: 13134090.92251816
num_examples: 10572
- name: val
num_bytes: 1642382.5387409201
num_examples: 1322
- name: test
num_bytes: 1642382.5387409201
num_examples: 1322
download_size: 8871141
dataset_size: 16418856.0
---
# Dataset Card for "strategy_pred_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713016716 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 343017
num_examples: 956
download_size: 128447
dataset_size: 343017
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/gabriel_tenma_white_gabrieldropout | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Gabriel Tenma White
This is the dataset of Gabriel Tenma White, containing 353 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 353 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 827 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 956 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 353 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 353 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 353 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 827 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 827 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 690 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 956 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 956 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
ssinha1039/Resume | ---
license: openrail
task_categories:
- text-classification
language:
- en
pretty_name: tiny_demo
size_categories:
- 1K<n<10K
--- |
Nexdata/64378_Images_Data_of_1073_Dogs_Noses | ---
license: cc-by-nc-nd-4.0
---
## Description
64,378 images of 1,073 dogs' noses. The data includes indoor and outdoor scenes (the collection scene for each dog did not change). It covers multiple dog breeds (such as Teddy, Labrador, Shiba Inu, etc.) and multiple lighting conditions. Segmentation annotation was done on each dog's nose. The data can be applied to dog face recognition, dog identification, etc.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1051?source=Huggingface
## Data size
1,073 dogs, 64,378 images
## Environment
multiple scenes, including indoor and outdoor scenes
## Diversity
multiple scenes, multiple dog breeds (such as Teddy, Labrador, Shiba Inu, etc.), multiple lighting conditions
## Device
cellphone
## Image parameters
videos are in .mp4 or .MOV format, images are in .jpg format
## Collection content
collecting videos of the dog's head
## Annotation
segmentation annotation on the dog's nose (noseprint)
## Application scenarios
a dog's noseprint is analogous to a human fingerprint; the data can be applied to dog face recognition, dog identification, etc.
# Licensing Information
Commercial License
|
gguichard/wsd_fr_wngt_semcor_translated_aligned_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 533318950.09168905
num_examples: 530592
- name: test
num_bytes: 2680706.9083109708
num_examples: 2667
download_size: 127035571
dataset_size: 535999657.0
---
# Dataset Card for "wsd_fr_wngt_semcor_translated_aligned_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SEACrowd/idn_tagged_corpus_csui | ---
tags:
- pos-tagging
language:
- ind
---
# idn_tagged_corpus_csui
Idn-tagged-corpus-CSUI is a POS tagging dataset containing about 10,000 sentences collected from the PAN Localization Project, tagged with 23 POS tag classes.
The POS tagset was created through a detailed study and analysis of existing tagsets and the manual tagging of an Indonesian corpus.
The Idn-tagged-corpus-CSUI dataset is split into 3 sets: 8,000 training, 1,000 validation, and 1,029 test sentences.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{dinakaramani2014designing,
title={Designing an Indonesian part of speech tagset and manually tagged Indonesian corpus},
author={Dinakaramani, Arawinda and Rashel, Fam and Luthfi, Andry and Manurung, Ruli},
booktitle={2014 International Conference on Asian Language Processing (IALP)},
pages={66--69},
year={2014},
organization={IEEE}
}
@inproceedings{kurniawan2018towards,
author={Kurniawan, Kemal and Aji, Alham Fikri},
booktitle={2018 International Conference on Asian Language Processing (IALP)},
title={Toward a Standardized and More Accurate Indonesian Part-of-Speech Tagging},
year={2018},
volume={},
number={},
pages={303-307},
doi={10.1109/IALP.2018.8629236}}
```
## License
Creative Commons Attribution Share-Alike 4.0 International
## Homepage
[https://bahasa.cs.ui.ac.id/postag/corpus](https://bahasa.cs.ui.ac.id/postag/corpus)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
Nexdata/208914_Bounding_Boxes_Human_Body_Attributes_Data_in_Surveillance_Scenes | ---
license: cc-by-nc-nd-4.0
---
## Description
208,914 Bounding Boxes – Human Body Attributes Data in Surveillance Scenes. The data includes indoor (shopping mall) and outdoor (street, shopping-mall gate, and square) scenes. The subjects include males and females, with ages ranging from children to the elderly. In this dataset, rectangular bounding boxes and 19 human body attributes were annotated. The data can be used for person attribute recognition.
For more details, please refer to the link: https://www.nexdata.ai/dataset/966?source=Huggingface
# Specifications
## Data size
206,290 human body bounding boxes, 2,624 appendage bounding boxes
## Environment
indoor (shopping mall) and outdoor (street, the gate of shopping mall and square) scenes
## Population
the race distribution is yellow race, the gender distribution is male and female, the age distribution is from children to the elderly
## Diversity
multiple age groups, multiple scenes, different poses
## Device
surveillance camera, the resolution is 1,920*1,080
## Photographic angles
looking down angle
## Format
.jpg, .json
## Annotation
the rectangular bounding boxes of human bodies, and 19 human body attributes
## Accuracy
annotation accuracy of bounding boxes is over 95%; annotation accuracy of attributes is over 95%
# Licensing Information
Commercial License
|
liuyanchen1015/MULTI_VALUE_wnli_for_to | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5565
num_examples: 26
- name: test
num_bytes: 10673
num_examples: 38
- name: train
num_bytes: 41564
num_examples: 191
download_size: 28373
dataset_size: 57802
---
# Dataset Card for "MULTI_VALUE_wnli_for_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mhenrichsen/hestenettet | ---
dataset_info:
features:
- name: text
dtype: string
- name: source
dtype: string
- name: doc_id
dtype: string
- name: LICENSE
dtype: string
- name: uri
dtype: string
- name: date_built
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1227838360
num_examples: 14498
download_size: 747772002
dataset_size: 1227838360
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hestenettet"
Subset of Gigaword.
https://huggingface.co/datasets/DDSC/partial-danish-gigaword-no-twitter |
open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha | ---
pretty_name: Evaluation run of CallComply/Starling-LM-11B-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CallComply/Starling-LM-11B-alpha](https://huggingface.co/CallComply/Starling-LM-11B-alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T22:50:55.626486](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha/blob/main/results_2024-01-14T22-50-55.626486.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6124497978149351,\n\
\ \"acc_stderr\": 0.032857819921299845,\n \"acc_norm\": 0.618390298674969,\n\
\ \"acc_norm_stderr\": 0.03352975999467289,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4153002055665266,\n\
\ \"mc2_stderr\": 0.014702058713161457\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6105357498506274,\n\
\ \"acc_stderr\": 0.0048663222583359665,\n \"acc_norm\": 0.8198566022704641,\n\
\ \"acc_norm_stderr\": 0.0038352143402103785\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031086,\n \"\
acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031086\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.014248873549217582,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.014248873549217582\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.01653117099327889,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.01653117099327889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616295,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616295\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573705,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573705\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4153002055665266,\n\
\ \"mc2_stderr\": 0.014702058713161457\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.011631268360607778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35178165276724793,\n \
\ \"acc_stderr\": 0.01315344602353602\n }\n}\n```"
repo_url: https://huggingface.co/CallComply/Starling-LM-11B-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|arc:challenge|25_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|gsm8k|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hellaswag|10_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T22-50-55.626486.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- '**/details_harness|winogrande|5_2024-01-14T22-50-55.626486.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T22-50-55.626486.parquet'
- config_name: results
data_files:
- split: 2024_01_14T22_50_55.626486
path:
- results_2024-01-14T22-50-55.626486.parquet
- split: latest
path:
- results_2024-01-14T22-50-55.626486.parquet
---
# Dataset Card for Evaluation run of CallComply/Starling-LM-11B-alpha
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/Starling-LM-11B-alpha](https://huggingface.co/CallComply/Starling-LM-11B-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1",
"harness_winogrande_5",
split="train")
```
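Since the timestamped split names use a fixed-width `YYYY_MM_DDTHH_MM_SS.ffffff` pattern, they sort chronologically as plain strings, so the most recent run can also be selected programmatically. A minimal sketch (the first split name below is hypothetical, added only to illustrate the comparison):

```python
# Split names follow the fixed-width pattern YYYY_MM_DDTHH_MM_SS.ffffff,
# plus a "latest" alias that always points at the newest run.
splits = ["2024_01_10T10_00_00.000000", "2024_01_14T22_50_55.626486", "latest"]

# Fixed-width timestamps compare chronologically as plain strings, so max()
# over the non-alias names yields the most recent run.
timestamped = [s for s in splits if s != "latest"]
most_recent = max(timestamped)
print(most_recent)  # 2024_01_14T22_50_55.626486
```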
## Latest results
These are the [latest results from run 2024-01-14T22:50:55.626486](https://huggingface.co/datasets/open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1/blob/main/results_2024-01-14T22-50-55.626486.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6124497978149351,
"acc_stderr": 0.032857819921299845,
"acc_norm": 0.618390298674969,
"acc_norm_stderr": 0.03352975999467289,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4153002055665266,
"mc2_stderr": 0.014702058713161457
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6105357498506274,
"acc_stderr": 0.0048663222583359665,
"acc_norm": 0.8198566022704641,
"acc_norm_stderr": 0.0038352143402103785
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029582,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217582,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217582
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.01653117099327889,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.01653117099327889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616295,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616295
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573705,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4153002055665266,
"mc2_stderr": 0.014702058713161457
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.011631268360607778
},
"harness|gsm8k|5": {
"acc": 0.35178165276724793,
"acc_stderr": 0.01315344602353602
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Rasi1610/Deathce502_series1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 131079275.0
num_examples: 293
- name: val
num_bytes: 33147989.0
num_examples: 74
download_size: 163718866
dataset_size: 164227264.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
vietgpt-archive/xlsum_vi | ---
dataset_info:
features:
- name: gem_id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: target
dtype: string
- name: references
list: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 219542787
num_examples: 32108
- name: test
num_bytes: 15891883
num_examples: 4013
- name: validation
num_bytes: 15896905
num_examples: 4013
download_size: 134434688
dataset_size: 251331575
task_categories:
- summarization
language:
- vi
tags:
- LM
size_categories:
- 10K<n<100K
---
# xlsum
- Source: https://huggingface.co/datasets/GEM/xlsum
- Num examples:
- 32,108 (train)
- 4,013 (validation)
- 4,013 (test)
- Language: Vietnamese
```python
from datasets import load_dataset
load_dataset("vietgpt-archive/xlsum_vi")
```
- Format for Summarization task
```python
def preprocess(sample):
    # Wrap each record's title, article, and summary in the control-token
    # template used for summarization fine-tuning.
    title = sample['title']
    summary = sample['target']
    article = sample['text']
    return {'text': f'<|startoftext|><|title|>{title}<|article|>{article}<|summary|>{summary}<|endoftext|>'}
"""
<|startoftext|><|title|>Việt Nam đã sẵn sàng nâng tầm đối tác chiến lược với Mỹ?<|article|>Ông Donald Trump và ông Nguyễn Phú Trọng bắt tay trước thềm Thượng đỉnh Trump-Kim ở Hà Nội hôm 27/2/2019
Vài tháng qua đã có nhiều thảo luận về khả năng Hoa Kỳ-Việt Nam nâng tầm mối quan hệ từ "đối tác toàn diện" lên thành "đối tác chiến lược". Dưới đây là một số nhận định tiêu biểu.
Một số quan ngại
Theo ông Prashanth Parameswaran, tác giả bài viết hôm 12/9 trên The Diplomat, mối quan hệ Việt Nam - Hoa Kỳ đã tốt hơn nhiều so với thời chiến tranh Việt Nam. Hai nước bình thường hóa quan hệ dưới thời cựu Tổng thống Mỹ Bill Clinton và tiếp tục duy trì tốt dưới thời Obama. Việc nâng tầm quan hệ Mỹ-Việt có ý nghĩa lớn với các nhà hoạch định chính sách cả hai nước. Nó phản ánh nỗ lực của Washington trong việc mở rộng mạng lưới các đồng minh và đối tác tại châu Á - Thái Bình Dương và tầm quan trọng của Việt Nam trong mạng lưới này, đồng thời nhấn mạnh cơ hội và thách thức mà Hà Nội phải cân nhắc.
TQ 'không vui' với chuyến thăm VN của USS Carl Vinson?
David Hutt: 'Mục tiêu thương chiến kế tiếp của Trump là VN'
USS Carl Vinson tới Đà Nẵng: 'Bước đi chiến lược'
Việc Mỹ-Việt nâng tầm quan hệ có thể có ý nghĩa lớn hơn là bản thân mối quan hệ này, đặc biệt trong bối cảnh Mỹ - Trung Quốc tăng cường cạnh tranh về quyền lực trong khu vực châu Á-Thái Bình Dương và vai trò của Việt Nam trong các vấn đề như Biển Đông - nơi mà Trung Quốc ngày càng lấn lướt và Hà Nội chịu áp lực ngày càng lớn.
Mỹ gần đây đã tăng cường các hoạt động hợp tác với Việt Nam. Năm 2018, Mỹ mang hàng không mẫu hạm USS Carl Vinson tới Việt Nam. Năm nay, Chủ tịch nước, Tổng bí thư Nguyễn Phú Trọng cũng dự kiến có chuyến công du Mỹ vào tháng 10/2019. Tuy nhiên, thực tế là Việt Nam và Mỹ vẫn có nhiều khác biệt trong nhiều lĩnh vực, từ thể chế tới quan điểm về nhân quyền. Việt Nam và Mỹ cũng có khác biệt trong quan điểm đối với vấn đề thương mại hoặc vấn đề Bắc Hàn - điều khiến quan hệ hai nước từng có vẻ khó 'toàn diện', chứ chưa nói đến 'chiến lược'. Chính vì thế, các cuộc thảo luận để nâng tầm mối quan hệ Mỹ - Việt cũng bao gồm cả các quan ngại, ông Prashanth Parameswaran bình luận.
Các nhà hoạch định chính sách sẽ cần cân nhắc các yếu tố quan trọng này để tính toán được mất khi nâng tầm mối quan hệ. Chẳng hạn, không phải ngẫu nhiên mà chúng ta đã thấy Việt Nam trì hoãn một số hoạt động liên quan đến quốc phòng với Hoa Kỳ bất chấp những lợi ích có thể thấy rõ, vẫn theo tác giả Prashanth Parameswaran.
Các quan ngại nói trên không có nghĩa Việt Nam - Hoa Kỳ không mong muốn hoặc không thể nâng tầng hợp tác. Nhưng nó có nghĩa rằng cả Mỹ và Việt Nam cần đảm bảo rằng các vấn đề thực tế giữa hai nước phù hợp với bất cứ tầm mức quan hệ nào mà họ lựa chọn. Quan trọng nữa là, việc điều chỉnh tên gọi của mối quan hệ chỉ có giá trị khi cả hai bên cùng cam kết nỗ lực để biến tiềm năng hợp tác thành sự hợp tác trên thực tế.
Mỹ gửi tín hiệu 'hỗn hợp'
Nhà báo David Hutt, cũng về đề tài này, trên Asia Times lại cho rằng Mỹ gửi những tín hiệu không thống nhất đến Việt Nam, nói năm nay, Mỹ lên tiếng cáo buộc Trung Quốc có hành động 'bắt nạt' nước láng giềng Việt Nam. Mỹ cũng ngỏ ý "muốn củng cố mối quan hệ quân sự chặt chẽ hơn với Hà Nội, mặc dù Việt Nam vẫn tỏ ra thận trọng và vẫn duy trì các chính sách ngoại giao không cam kết,"
David Hutt cũng nhắc tới tin đồn gần đây rằng công ty dầu khí Mỹ ExxonMobil đang tìm cách rút dự án Cá Voi Xanh trị giá hàng tỷ đô la khỏi Việt Nam, và bình luận rằng: Nếu thực sự ExxonMobil rút - cứ cho là vì lý do tài chính chứ không phải địa chính trị - thì đây cũng là một cú nốc ao vào mối quan hệ Mỹ-Việt ở giai đoạn mang tính bước ngoặt. Hơn bao giờ hết, Hà Nội hiện đang tìm kiếm các cam kết từ Washington rằng họ sẽ đứng về phía mình trong bất kỳ cuộc xung đột có vũ trang nào với Trung Quốc trên Biển Đông.
Thương mại: Ông Donald Trump đe dọa Việt Nam
Kỳ vọng gì nếu chủ tịch Trọng thăm Hoa Kỳ?
Tập trận Mỹ-ASEAN: 'Mỹ sẽ không đứng yên nếu TQ tiếp tục ép VN'
Mỹ, tuy thế, đang gửi tín hiệu 'hỗn hợp'. Quan hệ Việt Nam - Hoa Kỳ nảy nở dưới thời Tổng thống Mỹ Donald Trump. Ông Trump đã hai lần đến thăm Việt Nam và hiếm khi chỉ trích điều gì về đất nước được coi là vi phạm nhân quyền tồi tệ nhất Đông Nam Á này, vẫn theo David Hutt. Nhưng ông Trump, bên cạnh đó, lại cũng rất phiền lòng với việc Việt Nam trở thành nơi sản xuất, xuất khẩu các mặt hàng Trung Quốc nằm trong diện bị Mỹ đánh thuế, để trốn thuế. Ông Trump, hồi tháng Sáu đã gọi Việt Nam là nước 'lạm dụng tồi tệ nhất' trong một cuộc phỏng vấn truyền hình.
Tuy nhiên, chính quyền của ông Trung cũng lại phản ứng quyết liệt khi Trung Quốc mang tàu vào khu đặc quyền kinh tế của Việt Nam tại Bãi Tư Chính trên Biển Đông. Người phát ngôn Bộ Ngoại giao Mỹ Morgan Ortagus nói Trung Quốc đã thực hiện một loạt các động thái hung hăng để can thiệp các hoạt động kinh tế lâu đời của Việt Nam.
"Việt-Mỹ đã hợp tác chiến lược nhiều mặt, trừ tên gọi"
Trong khi đó, tác giả Đoàn Xuân Lộc viết trên Asia Times, một yếu tố quan trọng của chính sách đối ngoại của Hà Nội là không liên minh. Để giúp đất nước tăng cường quan hệ ngoại giao, kinh tế và an ninh với các đối tác liên quan, chính phủ Việt Nam, do đó, đã tìm cách xây dựng một mạng lưới quan hệ đối tác. "Quan hệ đối tác toàn diện" là nấc thấp nhất trong mạng lưới này.
Việt Nam và Hoa Kỳ thiết lập "quan hệ đối tác toàn diện" tháng 7/2013. Như vậy, Việt Nam đứng sau Philippines, Thái Lan, Indonesia và Singapore - các đối tác chiến lược của Hoa Kỳ trong khu vực - về tầm quan trọng đối với Washington.
Trong khi đó, Việt Nam đã nâng tầm "quan hệ chiến lược" với 16 nước gồm Nga (2001), Nhật Bản (2006), Ấn Độ (2007), Trung Quốc (2008), Hàn Quốc và Tây Ban Nha (2009), Vương quốc Anh (2010), Đức (2011), Pháp, Indonesia, Ý, Singapore và Thái Lan ( 2013), Malaysia và Philippines (2015) và Úc (2017).
Trong ngôn ngữ ngoại giao của Hà Nội, tất nhiên, Trung Quốc là đối tác quan trọng nhất của Việt Nam, trong khi Mỹ là một trong những quốc gia ít quan trọng nhất. Trên giấy tờ, mối "quan hệ đối tác toàn diện" của Việt Nam với Mỹ - nền kinh tế và quân sự lớn nhất thế giới - thậm chí còn xếp sau quan hệ "đối tác toàn diện" của Việt Nam với Myanmar - được thiết lập năm 2017.
Nhưng trên thực tế, Mỹ là đối tác quan trọng thứ hai của Việt Nam. Ở nhiều khía cạnh, Mỹ cũng quan trọng không kém Trung Quốc. Và Hà Nội hiểu rằng có một mối quan hệ khỏe mạnh với Mỹ mang tính sống còn với đất nước, giúp ổn định sự phát triển và tránh quá phụ thuộc vào Trung Quốc về kinh tế, ông Đoàn Xuân Lộc nhận định.
Hiện nay, sự hung hăng của Trung Quốc ở Biển Đông là một trong các yếu tố chính để Việt Nam tìm cách thắt chặt quan hệ với Mỹ, đặc biệt trong an ninh quốc phòng.
Nhìn chung, mặc dù vẫn có những khác biệt nhất định, đặc biệt là về các quyền tự do chính trị và nhân quyền, lợi ích chiến lược của Hoa Kỳ và Việt Nam ngày càng phù hợp với nhau. Đối Việt Nam, mối quan hệ với Mỹ hiện tại về cơ bản là chiến lược trong nhiều lĩnh vực quan trọng, như an ninh và quốc phòng, mặc dù về tên gọi nó mới chỉ là "quan hệ đối tác toàn diện", vẫn theo tác giả Đoàn Xuân Lộc.
<|summary|>Ý kiến về khả năng Việt-Mỹ trở thành đối tác chiến lược khi hai nước có nhiều khác biệt về thể chế chính trị và nhân quyền.<|endoftext|>
"""
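# Usage (a sketch): apply the template to every training example with
# Dataset.map, e.g.:
#   dataset = load_dataset("vietgpt-archive/xlsum_vi", split="train")
#   dataset = dataset.map(preprocess)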
``` |
higgsfield/school-math-questions | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 4787332
num_examples: 8792
download_size: 2576099
dataset_size: 4787332
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "school-math-questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
modelloosrvcc/Sam | ---
license: openrail
---
|
md_gender_bias | ---
annotations_creators:
- crowdsourced
- found
- machine-generated
language_creators:
- crowdsourced
- found
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 10K<n<100K
- 1K<n<10K
- 1M<n<10M
- n<1K
source_datasets:
- extended|other-convai2
- extended|other-light
- extended|other-opensubtitles
- extended|other-yelp
- original
task_categories:
- text-classification
task_ids: []
paperswithcode_id: md-gender
pretty_name: Multi-Dimensional Gender Bias Classification
tags:
- gender-bias
dataset_info:
- config_name: gendered_words
features:
- name: word_masculine
dtype: string
- name: word_feminine
dtype: string
splits:
- name: train
num_bytes: 4988
num_examples: 222
download_size: 232629010
dataset_size: 4988
- config_name: name_genders
features:
- name: name
dtype: string
- name: assigned_gender
dtype:
class_label:
names:
'0': M
'1': F
- name: count
dtype: int32
splits:
- name: yob1880
num_bytes: 43404
num_examples: 2000
- name: yob1881
num_bytes: 41944
num_examples: 1935
- name: yob1882
num_bytes: 46211
num_examples: 2127
- name: yob1883
num_bytes: 45221
num_examples: 2084
- name: yob1884
num_bytes: 49886
num_examples: 2297
- name: yob1885
num_bytes: 49810
num_examples: 2294
- name: yob1886
num_bytes: 51935
num_examples: 2392
- name: yob1887
num_bytes: 51458
num_examples: 2373
- name: yob1888
num_bytes: 57531
num_examples: 2651
- name: yob1889
num_bytes: 56177
num_examples: 2590
- name: yob1890
num_bytes: 58509
num_examples: 2695
- name: yob1891
num_bytes: 57767
num_examples: 2660
- name: yob1892
num_bytes: 63493
num_examples: 2921
- name: yob1893
num_bytes: 61525
num_examples: 2831
- name: yob1894
num_bytes: 63927
num_examples: 2941
- name: yob1895
num_bytes: 66346
num_examples: 3049
- name: yob1896
num_bytes: 67224
num_examples: 3091
- name: yob1897
num_bytes: 65886
num_examples: 3028
- name: yob1898
num_bytes: 71088
num_examples: 3264
- name: yob1899
num_bytes: 66225
num_examples: 3042
- name: yob1900
num_bytes: 81305
num_examples: 3730
- name: yob1901
num_bytes: 68723
num_examples: 3153
- name: yob1902
num_bytes: 73321
num_examples: 3362
- name: yob1903
num_bytes: 74019
num_examples: 3389
- name: yob1904
num_bytes: 77751
num_examples: 3560
- name: yob1905
num_bytes: 79802
num_examples: 3655
- name: yob1906
num_bytes: 79392
num_examples: 3633
- name: yob1907
num_bytes: 86342
num_examples: 3948
- name: yob1908
num_bytes: 87965
num_examples: 4018
- name: yob1909
num_bytes: 92591
num_examples: 4227
- name: yob1910
num_bytes: 101491
num_examples: 4629
- name: yob1911
num_bytes: 106787
num_examples: 4867
- name: yob1912
num_bytes: 139448
num_examples: 6351
- name: yob1913
num_bytes: 153110
num_examples: 6968
- name: yob1914
num_bytes: 175167
num_examples: 7965
- name: yob1915
num_bytes: 205921
num_examples: 9357
- name: yob1916
num_bytes: 213468
num_examples: 9696
- name: yob1917
num_bytes: 218446
num_examples: 9913
- name: yob1918
num_bytes: 229209
num_examples: 10398
- name: yob1919
num_bytes: 228656
num_examples: 10369
- name: yob1920
num_bytes: 237286
num_examples: 10756
- name: yob1921
num_bytes: 239616
num_examples: 10857
- name: yob1922
num_bytes: 237569
num_examples: 10756
- name: yob1923
num_bytes: 235046
num_examples: 10643
- name: yob1924
num_bytes: 240113
num_examples: 10869
- name: yob1925
num_bytes: 235098
num_examples: 10638
- name: yob1926
num_bytes: 230970
num_examples: 10458
- name: yob1927
num_bytes: 230004
num_examples: 10406
- name: yob1928
num_bytes: 224583
num_examples: 10159
- name: yob1929
num_bytes: 217057
num_examples: 9820
- name: yob1930
num_bytes: 216352
num_examples: 9791
- name: yob1931
num_bytes: 205361
num_examples: 9298
- name: yob1932
num_bytes: 207268
num_examples: 9381
- name: yob1933
num_bytes: 199031
num_examples: 9013
- name: yob1934
num_bytes: 202758
num_examples: 9180
- name: yob1935
num_bytes: 199614
num_examples: 9037
- name: yob1936
num_bytes: 196379
num_examples: 8894
- name: yob1937
num_bytes: 197757
num_examples: 8946
- name: yob1938
num_bytes: 199603
num_examples: 9032
- name: yob1939
num_bytes: 196979
num_examples: 8918
- name: yob1940
num_bytes: 198141
num_examples: 8961
- name: yob1941
num_bytes: 200858
num_examples: 9085
- name: yob1942
num_bytes: 208363
num_examples: 9425
- name: yob1943
num_bytes: 207940
num_examples: 9408
- name: yob1944
num_bytes: 202227
num_examples: 9152
- name: yob1945
num_bytes: 199478
num_examples: 9025
- name: yob1946
num_bytes: 214614
num_examples: 9705
- name: yob1947
num_bytes: 229327
num_examples: 10371
- name: yob1948
num_bytes: 226615
num_examples: 10241
- name: yob1949
num_bytes: 227278
num_examples: 10269
- name: yob1950
num_bytes: 227946
num_examples: 10303
- name: yob1951
num_bytes: 231613
num_examples: 10462
- name: yob1952
num_bytes: 235483
num_examples: 10646
- name: yob1953
num_bytes: 239654
num_examples: 10837
- name: yob1954
num_bytes: 242389
num_examples: 10968
- name: yob1955
num_bytes: 245652
num_examples: 11115
- name: yob1956
num_bytes: 250674
num_examples: 11340
- name: yob1957
num_bytes: 255370
num_examples: 11564
- name: yob1958
num_bytes: 254520
num_examples: 11522
- name: yob1959
num_bytes: 260051
num_examples: 11767
- name: yob1960
num_bytes: 263474
num_examples: 11921
- name: yob1961
num_bytes: 269493
num_examples: 12182
- name: yob1962
num_bytes: 270244
num_examples: 12209
- name: yob1963
num_bytes: 271872
num_examples: 12282
- name: yob1964
num_bytes: 274590
num_examples: 12397
- name: yob1965
num_bytes: 264889
num_examples: 11952
- name: yob1966
num_bytes: 269321
num_examples: 12151
- name: yob1967
num_bytes: 274867
num_examples: 12397
- name: yob1968
num_bytes: 286774
num_examples: 12936
- name: yob1969
num_bytes: 304909
num_examples: 13749
- name: yob1970
num_bytes: 328047
num_examples: 14779
- name: yob1971
num_bytes: 339657
num_examples: 15295
- name: yob1972
num_bytes: 342321
num_examples: 15412
- name: yob1973
num_bytes: 348414
num_examples: 15682
- name: yob1974
num_bytes: 361188
num_examples: 16249
- name: yob1975
num_bytes: 376491
num_examples: 16944
- name: yob1976
num_bytes: 386565
num_examples: 17391
- name: yob1977
num_bytes: 403994
num_examples: 18175
- name: yob1978
num_bytes: 405430
num_examples: 18231
- name: yob1979
num_bytes: 423423
num_examples: 19039
- name: yob1980
num_bytes: 432317
num_examples: 19452
- name: yob1981
num_bytes: 432980
num_examples: 19475
- name: yob1982
num_bytes: 437986
num_examples: 19694
- name: yob1983
num_bytes: 431531
num_examples: 19407
- name: yob1984
num_bytes: 434085
num_examples: 19506
- name: yob1985
num_bytes: 447113
num_examples: 20085
- name: yob1986
num_bytes: 460315
num_examples: 20657
- name: yob1987
num_bytes: 477677
num_examples: 21406
- name: yob1988
num_bytes: 499347
num_examples: 22367
- name: yob1989
num_bytes: 531020
num_examples: 23775
- name: yob1990
num_bytes: 552114
num_examples: 24716
- name: yob1991
num_bytes: 560932
num_examples: 25109
- name: yob1992
num_bytes: 568151
num_examples: 25427
- name: yob1993
num_bytes: 579778
num_examples: 25966
- name: yob1994
num_bytes: 580223
num_examples: 25997
- name: yob1995
num_bytes: 581949
num_examples: 26080
- name: yob1996
num_bytes: 589131
num_examples: 26423
- name: yob1997
num_bytes: 601284
num_examples: 26970
- name: yob1998
num_bytes: 621587
num_examples: 27902
- name: yob1999
num_bytes: 635355
num_examples: 28552
- name: yob2000
num_bytes: 662398
num_examples: 29772
- name: yob2001
num_bytes: 673111
num_examples: 30274
- name: yob2002
num_bytes: 679392
num_examples: 30564
- name: yob2003
num_bytes: 692931
num_examples: 31185
- name: yob2004
num_bytes: 711776
num_examples: 32048
- name: yob2005
num_bytes: 723065
num_examples: 32549
- name: yob2006
num_bytes: 757620
num_examples: 34088
- name: yob2007
num_bytes: 776893
num_examples: 34961
- name: yob2008
num_bytes: 779403
num_examples: 35079
- name: yob2009
num_bytes: 771032
num_examples: 34709
- name: yob2010
num_bytes: 756717
num_examples: 34073
- name: yob2011
num_bytes: 752804
num_examples: 33908
- name: yob2012
num_bytes: 748915
num_examples: 33747
- name: yob2013
num_bytes: 738288
num_examples: 33282
- name: yob2014
num_bytes: 737219
num_examples: 33243
- name: yob2015
num_bytes: 734183
num_examples: 33121
- name: yob2016
num_bytes: 731291
num_examples: 33010
- name: yob2017
num_bytes: 721444
num_examples: 32590
- name: yob2018
num_bytes: 708657
num_examples: 32033
download_size: 232629010
dataset_size: 43393095
- config_name: new_data
features:
- name: text
dtype: string
- name: original
dtype: string
- name: labels
list:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
'2': PARTNER:female
'3': PARTNER:male
'4': SELF:female
'5': SELF:male
- name: class_type
dtype:
class_label:
names:
'0': about
'1': partner
'2': self
- name: turker_gender
dtype:
class_label:
names:
'0': man
'1': woman
'2': nonbinary
'3': prefer not to say
'4': no answer
- name: episode_done
dtype: bool_
- name: confidence
dtype: string
splits:
- name: train
num_bytes: 369753
num_examples: 2345
download_size: 232629010
dataset_size: 369753
- config_name: funpedia
features:
- name: text
dtype: string
- name: title
dtype: string
- name: persona
dtype: string
- name: gender
dtype:
class_label:
names:
'0': gender-neutral
'1': female
'2': male
splits:
- name: train
num_bytes: 3225542
num_examples: 23897
- name: validation
num_bytes: 402205
num_examples: 2984
- name: test
num_bytes: 396417
num_examples: 2938
download_size: 232629010
dataset_size: 4024164
- config_name: image_chat
features:
- name: caption
dtype: string
- name: id
dtype: string
- name: male
dtype: bool_
- name: female
dtype: bool_
splits:
- name: train
num_bytes: 1061285
num_examples: 9997
- name: validation
num_bytes: 35868670
num_examples: 338180
- name: test
num_bytes: 530126
num_examples: 5000
download_size: 232629010
dataset_size: 37460081
- config_name: wizard
features:
- name: text
dtype: string
- name: chosen_topic
dtype: string
- name: gender
dtype:
class_label:
names:
'0': gender-neutral
'1': female
'2': male
splits:
- name: train
num_bytes: 1158785
num_examples: 10449
- name: validation
num_bytes: 57824
num_examples: 537
- name: test
num_bytes: 53126
num_examples: 470
download_size: 232629010
dataset_size: 1269735
- config_name: convai2_inferred
features:
- name: text
dtype: string
- name: binary_label
dtype:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
- name: binary_score
dtype: float32
- name: ternary_label
dtype:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
'2': ABOUT:gender-neutral
- name: ternary_score
dtype: float32
splits:
- name: train
num_bytes: 9853669
num_examples: 131438
- name: validation
num_bytes: 608046
num_examples: 7801
- name: test
num_bytes: 608046
num_examples: 7801
download_size: 232629010
dataset_size: 11069761
- config_name: light_inferred
features:
- name: text
dtype: string
- name: binary_label
dtype:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
- name: binary_score
dtype: float32
- name: ternary_label
dtype:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
'2': ABOUT:gender-neutral
- name: ternary_score
dtype: float32
splits:
- name: train
num_bytes: 10931355
num_examples: 106122
- name: validation
num_bytes: 679692
num_examples: 6362
- name: test
num_bytes: 1375745
num_examples: 12765
download_size: 232629010
dataset_size: 12986792
- config_name: opensubtitles_inferred
features:
- name: text
dtype: string
- name: binary_label
dtype:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
- name: binary_score
dtype: float32
- name: ternary_label
dtype:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
'2': ABOUT:gender-neutral
- name: ternary_score
dtype: float32
splits:
- name: train
num_bytes: 27966476
num_examples: 351036
- name: validation
num_bytes: 3363802
num_examples: 41957
- name: test
num_bytes: 3830528
num_examples: 49108
download_size: 232629010
dataset_size: 35160806
- config_name: yelp_inferred
features:
- name: text
dtype: string
- name: binary_label
dtype:
class_label:
names:
'0': ABOUT:female
'1': ABOUT:male
- name: binary_score
dtype: float32
splits:
- name: train
num_bytes: 260582945
num_examples: 2577862
- name: validation
num_bytes: 324349
num_examples: 4492
- name: test
num_bytes: 53887700
num_examples: 534460
download_size: 232629010
dataset_size: 314794994
config_names:
- convai2_inferred
- funpedia
- gendered_words
- image_chat
- light_inferred
- name_genders
- new_data
- opensubtitles_inferred
- wizard
- yelp_inferred
---
# Dataset Card for Multi-Dimensional Gender Bias Classification
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [ParlAI MD Gender Project Page](https://parl.ai/projects/md_gender/)
- **Repository:** [ParlAI Github MD Gender Repository](https://github.com/facebookresearch/ParlAI/tree/master/projects/md_gender)
- **Paper:** [Multi-Dimensional Gender Bias Classification](https://www.aclweb.org/anthology/2020.emnlp-main.23.pdf)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** edinan@fb.com
### Dataset Summary
The Multi-Dimensional Gender Bias Classification dataset is based on a general framework that decomposes gender bias in text along several pragmatic and semantic dimensions: bias from the gender of the person being spoken about, bias from the gender of the person being spoken to, and bias from the gender of the speaker. It contains seven large scale datasets automatically annotated for gender information (there are eight in the original project but the Wikipedia set is not included in the HuggingFace distribution), one crowdsourced evaluation benchmark of utterance-level gender rewrites, a list of gendered names, and a list of gendered words in English.
### Supported Tasks and Leaderboards
- `text-classification-other-gender-bias`: The dataset can be used to train a model for classification of various kinds of gender bias. The model performance is evaluated based on the accuracy of the predicted labels as compared to the given labels in the dataset. Dinan et al's (2020) Transformer model achieved an average of 67.13% accuracy in binary gender prediction across the ABOUT, TO, and AS tasks. See the paper for more results.
### Languages
The data is in English as spoken on the various sites where the data was collected. The associated BCP-47 code is `en`.
## Dataset Structure
### Data Instances
The following are examples of data instances from the various configs in the dataset. See the [MD Gender Bias dataset viewer](https://huggingface.co/datasets/viewer/?dataset=md_gender_bias) to explore more examples.
An example from the `new_data` config:
```
{'class_type': 0,
'confidence': 'certain',
'episode_done': True,
'labels': [1],
'original': 'She designed monumental Loviisa war cemetery in 1920',
'text': 'He designed monumental Lovissa War Cemetery in 1920.',
'turker_gender': 4}
```
An example from the `funpedia` config:
```
{'gender': 2,
'persona': 'Humorous',
'text': 'Max Landis is a comic book writer who wrote Chronicle, American Ultra, and Victor Frankestein.',
'title': 'Max Landis'}
```
An example from the `image_chat` config:
```
{'caption': '<start> a young girl is holding a pink umbrella in her hand <eos>',
'female': True,
'id': '2923e28b6f588aff2d469ab2cccfac57',
'male': False}
```
An example from the `wizard` config:
```
{'chosen_topic': 'Krav Maga',
'gender': 2,
'text': 'Hello. I hope you might enjoy or know something about Krav Maga?'}
```
An example from the `convai2_inferred` config (the other `_inferred` configs have the same fields, with the exception of `yelp_inferred`, which does not have the `ternary_label` or `ternary_score` fields):
```
{'binary_label': 1,
'binary_score': 0.6521999835968018,
'ternary_label': 2,
'ternary_score': 0.4496000111103058,
'text': "hi , how are you doing ? i'm getting ready to do some cheetah chasing to stay in shape ."}
```
An example from the `gendered_words` config:
```
{'word_feminine': 'countrywoman',
'word_masculine': 'countryman'}
```
An example from the `name_genders` config:
```
{'assigned_gender': 1,
'count': 7065,
'name': 'Mary'}
```
### Data Fields
The following are the features for each of the configs.
For the `new_data` config:
- `text`: the text to be classified
- `original`: the text before reformulation
- `labels`: a `list` of classification labels, with possible values including `ABOUT:female`, `ABOUT:male`, `PARTNER:female`, `PARTNER:male`, `SELF:female`, and `SELF:male`.
- `class_type`: a classification label, with possible values including `about` (0), `partner` (1), `self` (2).
- `turker_gender`: a classification label, with possible values including `man` (0), `woman` (1), `nonbinary` (2), `prefer not to say` (3), `no answer` (4).
- `episode_done`: a boolean indicating whether the conversation was completed.
- `confidence`: a string indicating the confidence of the annotator in response to the instance label being ABOUT/TO/AS a man or woman. Possible values are `certain`, `pretty sure`, and `unsure`.
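The integer-to-name mappings above can be applied directly to a raw instance. The sketch below is illustrative only (the `datasets` library exposes the same mappings via `ClassLabel.int2str`); it decodes the `new_data` example shown earlier.

```python
# Decode the integer-encoded fields of a `new_data` instance into
# human-readable names, using the label mappings listed above.

LABEL_NAMES = [
    "ABOUT:female", "ABOUT:male",
    "PARTNER:female", "PARTNER:male",
    "SELF:female", "SELF:male",
]
CLASS_TYPE_NAMES = ["about", "partner", "self"]
TURKER_GENDER_NAMES = ["man", "woman", "nonbinary", "prefer not to say", "no answer"]

def decode_instance(instance):
    """Return a copy of the instance with integer labels replaced by names."""
    decoded = dict(instance)
    decoded["labels"] = [LABEL_NAMES[i] for i in instance["labels"]]
    decoded["class_type"] = CLASS_TYPE_NAMES[instance["class_type"]]
    decoded["turker_gender"] = TURKER_GENDER_NAMES[instance["turker_gender"]]
    return decoded

example = {
    "class_type": 0,
    "confidence": "certain",
    "episode_done": True,
    "labels": [1],
    "original": "She designed monumental Loviisa war cemetery in 1920",
    "text": "He designed monumental Lovissa War Cemetery in 1920.",
    "turker_gender": 4,
}
print(decode_instance(example))  # labels -> ['ABOUT:male'], class_type -> 'about'
```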
For the `funpedia` config:
- `text`: the text to be classified.
- `gender`: a classification label, with possible values including `gender-neutral` (0), `female` (1), `male` (2), indicating the gender of the person being talked about.
- `persona`: a string describing the persona assigned to the user when talking about the entity.
- `title`: a string naming the entity the text is about.
For the `image_chat` config:
- `caption`: a string description of the contents of the original image.
- `female`: a boolean indicating whether the gender of the person being talked about is female, if the image contains a person.
- `id`: a string indicating the id of the image.
- `male`: a boolean indicating whether the gender of the person being talked about is male, if the image contains a person.
For the `wizard` config:
- `text`: the text to be classified.
- `chosen_topic`: a string indicating the topic of the text.
- `gender`: a classification label, with possible values including `gender-neutral` (0), `female` (1), `male` (2), indicating the gender of the person being talked about.
For the `_inferred` configurations (again, except the `yelp_inferred` config, which does not have the `ternary_label` or `ternary_score` fields):
- `text`: the text to be classified.
- `binary_label`: a classification label, with possible values including `ABOUT:female`, `ABOUT:male`.
- `binary_score`: a float indicating a score between 0 and 1.
- `ternary_label`: a classification label, with possible values including `ABOUT:female`, `ABOUT:male`, `ABOUT:gender-neutral`.
- `ternary_score`: a float indicating a score between 0 and 1.
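Since the `_inferred` labels come from a classifier, the scores can be used to filter for confident predictions, mirroring the "very confident" filtering the curators describe for Yelp in the Annotations section. The threshold and records below are illustrative, not from the paper.

```python
# Keep only instances whose inferred-label score clears a confidence
# threshold. THRESHOLD is an assumed value for illustration.

THRESHOLD = 0.9

def confident(instances, threshold=THRESHOLD):
    """Return instances where the classifier score meets the threshold."""
    return [x for x in instances if x["binary_score"] >= threshold]

instances = [
    {"text": "hi , how are you doing ?", "binary_label": 1, "binary_score": 0.65},
    {"text": "great pasta , will come back", "binary_label": 0, "binary_score": 0.97},
]
print(confident(instances))  # keeps only the 0.97-score instance
```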
For the word list:
- `word_masculine`: a string indicating the masculine version of the word.
- `word_feminine`: a string indicating the feminine version of the word.
For the gendered name list:
- `assigned_gender`: an integer, 1 for female, 0 for male.
- `count`: an integer.
- `name`: a string of the name.
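The `name_genders` records can be aggregated into a per-name gender probability, which is how the Annotations section describes the OpenSubtitles character labels being derived. The records below are hypothetical, shaped like the config's fields.

```python
from collections import defaultdict

# Aggregate (name, assigned_gender, count) records into a per-name
# probability of the name being assigned female (1) vs. male (0).

records = [
    {"name": "Mary", "assigned_gender": 1, "count": 7065},
    {"name": "Mary", "assigned_gender": 0, "count": 35},
    {"name": "John", "assigned_gender": 0, "count": 9655},
]

def female_probability(records):
    totals = defaultdict(int)   # name -> total count across records
    female = defaultdict(int)   # name -> count with assigned_gender == 1
    for r in records:
        totals[r["name"]] += r["count"]
        if r["assigned_gender"] == 1:
            female[r["name"]] += r["count"]
    return {name: female[name] / totals[name] for name in totals}

probs = female_probability(records)
print(probs["Mary"])  # fraction of "Mary" records assigned female
```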
### Data Splits
The different parts of the data can be accessed through the different configurations:
- `gendered_words`: A list of common nouns with a masculine and feminine variant.
- `new_data`: Sentences reformulated and annotated along all three axes.
- `funpedia`, `wizard`: Sentences from Funpedia and Wizards of Wikipedia annotated with ABOUT gender with entity gender information.
- `image_chat`: Sentences about images annotated with ABOUT gender, based on gender information from the entities in the image.
- `convai2_inferred`, `light_inferred`, `opensubtitles_inferred`, `yelp_inferred`: Data from several source datasets with ABOUT annotations inferred by a trained classifier.
| Split | M | F | N | U | Dimension |
| ---------- | ---- | --- | ---- | ---- | --------- |
| Image Chat | 39K | 15K | 154K | - | ABOUT |
| Funpedia | 19K | 3K | 1K | - | ABOUT |
| Wizard | 6K | 1K | 1K | - | ABOUT |
| Yelp | 1M | 1M | - | - | AS |
| ConvAI2 | 22K | 22K | - | 86K | AS |
| ConvAI2 | 22K | 22K | - | 86K | TO |
| OpenSub | 149K | 69K | - | 131K | AS |
| OpenSub | 95K | 45K | - | 209K | TO |
| LIGHT | 13K | 8K | - | 83K | AS |
| LIGHT | 13K | 8K | - | 83K | TO |
| ---------- | ---- | --- | ---- | ---- | --------- |
| MDGender | 384 | 401 | - | - | ABOUT |
| MDGender | 396 | 371 | - | - | AS |
| MDGender | 411 | 382 | - | - | TO |
## Dataset Creation
### Curation Rationale
The curators chose to annotate the existing corpora to make their classifiers reliable on all dimensions (ABOUT/TO/AS) and across multiple domains. However, none of the existing datasets cover all three dimensions at the same time, and many of the gender labels are noisy. To enable reliable evaluation, the curators collected a specialized corpus, found in the `new_data` config, which acts as a gold-labeled dataset for the masculine and feminine classes.
### Source Data
#### Initial Data Collection and Normalization
For the `new_data` config, the curators collected conversations between two speakers. Each speaker was provided with a persona description containing gender information, then tasked with adopting that persona and having a conversation. They were also provided with small sections of a biography from Wikipedia as the conversation topic in order to encourage crowdworkers to discuss ABOUT/TO/AS gender information. To ensure there is ABOUT/TO/AS gender information contained in each utterance, the curators asked a second set of annotators to rewrite each utterance to make it very clear that they are speaking ABOUT a man or a woman, speaking AS a man or a woman, and speaking TO a man or a woman.
#### Who are the source language producers?
This dataset was collected from crowdworkers from Amazon’s Mechanical Turk. All workers are English-speaking and located in the United States.
| Reported Gender | Percent of Total |
| ----------------- | ---------------- |
| Man | 67.38 |
| Woman | 18.34 |
| Non-binary | 0.21 |
| Prefer not to say | 14.07 |
### Annotations
#### Annotation process
For the `new_data` config, annotators were asked to label how confident they are that someone else could predict the given gender label, allowing for flexibility between explicit genderedness (like the use of "he" or "she") and statistical genderedness.
Many of the annotated datasets contain cases where the ABOUT, AS, TO labels are not provided (i.e. unknown). In such instances, the curators apply one of two strategies. They apply the imputation strategy for data for which the ABOUT label is unknown using a classifier trained only on other Wikipedia data for which this label is provided. Data without a TO or AS label was assigned one at random, choosing between masculine and feminine with equal probability. Details of how each of the eight training datasets was annotated are as follows:
1. Wikipedia- To annotate ABOUT, the curators used a Wikipedia dump and extracted biography pages using named entity recognition. They labeled pages with a gender based on the number of gendered pronouns (he vs. she vs. they) and labeled each paragraph in the page with this label for the ABOUT dimension.
2. Funpedia- Funpedia ([Miller et al., 2017](https://www.aclweb.org/anthology/D17-2014/)) contains Wikipedia sentences rephrased in a more conversational style. The curators retained only biography-related sentences and annotated them as for Wikipedia to give ABOUT labels.
3. Wizard of Wikipedia- [Wizard of Wikipedia](https://parl.ai/projects/wizard_of_wikipedia/) contains two people discussing a topic in Wikipedia. The curators retained only the conversations on Wikipedia biographies and annotated them to create ABOUT labels.
4. ImageChat- [ImageChat](https://klshuster.github.io/image_chat/) contains conversations discussing the contents of an image. The curators used the [Xu et al. image captioning system](https://github.com/AaronCCWong/Show-Attend-and-Tell) to identify the contents of an image and select gendered examples.
5. Yelp- The curators used the Yelp reviewer gender predictor developed by ([Subramanian et al., 2018](https://arxiv.org/pdf/1811.00552.pdf)) and retain reviews for which the classifier is very confident – this creates labels for the content creator of the review (AS). They impute ABOUT labels on this dataset using a classifier trained on the datasets 1-4.
6. ConvAI2- [ConvAI2](https://parl.ai/projects/convai2/) contains persona-based conversations. Many personas contain sentences such as 'I am a old woman' or 'My name is Bob' which allows annotators to annotate the gender of the speaker (AS) and addressee (TO) with some confidence. Many of the personas have unknown gender. The curators impute ABOUT labels on this dataset using a classifier trained on the datasets 1-4.
7. OpenSubtitles- [OpenSubtitles](http://www.opensubtitles.org/) contains subtitles for movies in different languages. The curators retained English subtitles that contain a character name or identity. They annotated the character's gender using gender kinship terms such as "daughter" and a gender probability distribution calculated by counting masculine and feminine baby names in the United States. Using the character's gender, they produced labels for the AS dimension. They produced labels for the TO dimension by taking the gender of the next character to speak if there is another utterance in the conversation; otherwise, they take the gender of the last character to speak. They impute ABOUT labels on this dataset using a classifier trained on the datasets 1-4.
8. LIGHT- [LIGHT](https://parl.ai/projects/light/) contains persona-based conversation. Similarly to ConvAI2, annotators labeled the gender of each persona, giving labels for the speaker (AS) and speaking partner (TO). The curators impute ABOUT labels on this dataset using a classifier trained on the datasets 1-4.
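The random fallback for missing TO/AS labels described above ("choosing between masculine and feminine with equal probability") can be sketched as follows; the function name and record shape are illustrative.

```python
import random

# Instances whose TO or AS label is unknown (None) get a uniform
# coin-flip assignment, as the curators describe.

def impute_label(label, rng=random):
    """Return the label unchanged if known, else a random assignment."""
    if label is not None:
        return label
    return rng.choice(["masculine", "feminine"])

rng = random.Random(0)  # seeded here only for reproducibility
imputed = [impute_label(None, rng) for _ in range(4)]
print(imputed)
```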
#### Who are the annotators?
This dataset was annotated by crowdworkers from Amazon’s Mechanical Turk. All workers are English-speaking and located in the United States.
### Personal and Sensitive Information
For privacy reasons the curators did not associate the self-reported gender of the annotator with the labeled examples in the dataset and only report these statistics in aggregate.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for applications such as controlling for gender bias in generative models, detecting gender bias in arbitrary text, and classifying text as offensive based on its genderedness.
### Discussion of Biases
Over two thirds of annotators identified as men, which may introduce biases into the dataset.
Wikipedia is also well known to have gender bias in equity of biographical coverage and lexical bias in noun references to women (see the paper's appendix for citations).
### Other Known Limitations
The limitations of the Multi-Dimensional Gender Bias Classification dataset have not yet been investigated, but the curators acknowledge that more work is required to address the intersectionality of gender identities, i.e., when gender non-additively interacts with other identity characteristics. The curators point out that negative gender stereotyping is known to be alternatively weakened or reinforced by the presence of social attributes like dialect, class and race and that these differences have been found to affect gender classification in images and sentences encoders. See the paper for references.
## Additional Information
### Dataset Curators
Emily Dinan, Angela Fan, Ledell Wu, Jason Weston, Douwe Kiela, and Adina Williams at Facebook AI Research. Angela Fan is also affiliated with Laboratoire Lorrain d’Informatique et Applications (LORIA).
### Licensing Information
The Multi-Dimensional Gender Bias Classification dataset is licensed under the [MIT License](https://opensource.org/licenses/MIT).
### Citation Information
```
@inproceedings{dinan-etal-2020-multi,
title = "Multi-Dimensional Gender Bias Classification",
author = "Dinan, Emily and
Fan, Angela and
Wu, Ledell and
Weston, Jason and
Kiela, Douwe and
Williams, Adina",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.emnlp-main.23",
doi = "10.18653/v1/2020.emnlp-main.23",
pages = "314--331",
abstract = "Machine learning models are trained to find patterns in data. NLP models can inadvertently learn socially undesirable patterns when training on gender biased text. In this work, we propose a novel, general framework that decomposes gender bias in text along several pragmatic and semantic dimensions: bias from the gender of the person being spoken about, bias from the gender of the person being spoken to, and bias from the gender of the speaker. Using this fine-grained framework, we automatically annotate eight large scale datasets with gender information. In addition, we collect a new, crowdsourced evaluation benchmark. Distinguishing between gender bias along multiple dimensions enables us to train better and more fine-grained gender bias classifiers. We show our classifiers are valuable for a variety of applications, like controlling for gender bias in generative models, detecting gender bias in arbitrary text, and classifying text as offensive based on its genderedness.",
}
```
### Contributions
Thanks to [@yjernite](https://github.com/yjernite) and [@mcmillanmajora](https://github.com/mcmillanmajora) for adding this dataset.
jorgeortizfuentes/sfl_automatization_spanish_nominal_groups | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: ng_tags
sequence:
class_label:
names:
'0': B-NG
'1': I-NG
'2': O
splits:
- name: train
num_bytes: 1363436.5003837298
num_examples: 1824
- name: test
num_bytes: 292271.7498081351
num_examples: 391
- name: validation
num_bytes: 292271.7498081351
num_examples: 391
download_size: 520154
dataset_size: 1947980.0
---
# Dataset Card for "sfl_automatization_spanish_nominal_groups"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
wbbbbb/pclue | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
---
# pCLUE
pCLUE: Large-scale Prompt-based Dataset for Multi-task and Zero-shot Learning in Chinese
pCLUE: a large-scale prompt-based pretraining dataset for multi-task and zero-shot learning in Chinese
### Converted Datasets
Data volume: 1.2 million training examples, 73 prompts
1. Training set train.json: 1,200,705
2. Validation set dev.json: 100,000
3. Public test set test_public.json: 129,556
4. Test set test.json: 250,461
For the data files, see: ./datasets
### Currently covers 9 datasets:
1. Single-label classification: tnews
2. Single-label classification: iflytek
3. Natural language inference: ocnli
4. Semantic matching: afqmc
5. Coreference resolution: cluewsc2020
6. Keyword recognition: csl
7. Free-form reading comprehension: c3
8. Extractive reading comprehension: cmrc2018
9. Idiom cloze reading comprehension: chid
### Field Descriptions and Evaluation Metrics:
input: the model input
target: the model output
type: the task type, one of reading comprehension (mrc), classification (classify), generation (generate), or natural language inference (nli)
Evaluation metrics: reading comprehension (em), classification (acc), generation (em), natural language inference (acc)
answer_choices: the candidate options (only present for classification and inference tasks)
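A pCLUE line can be parsed with the standard `json` module and routed to its metric by the `type` field, following the mapping listed above. The input string here is a shortened stand-in for a real instance.

```python
import json

# Map each pCLUE task type to its evaluation metric:
# mrc -> em, classify -> acc, generate -> em, nli -> acc.

METRIC_BY_TYPE = {"mrc": "em", "classify": "acc", "generate": "em", "nli": "acc"}

line = '{"input": "...", "target": "体育", "type": "classify"}'
example = json.loads(line)
metric = METRIC_BY_TYPE[example["type"]]
print(metric)  # -> acc
```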
### Submission Example:
See resources/promptclue_submit_examples. Submit a single file where each line is a JSON object, e.g.: {"target": "2000万元"}
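A submission file in that shape (one `{"target": ...}` JSON object per line) can be produced like this; the prediction strings are placeholders.

```python
import json

# Serialize predictions in the submission format: one JSON object per
# line with a single "target" field. ensure_ascii=False keeps Chinese
# text readable instead of escaping it.

predictions = ["2000万元", "体育"]
lines = [json.dumps({"target": p}, ensure_ascii=False) for p in predictions]
submission = "\n".join(lines)
print(submission)
```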
### Examples:
{"input": "哪个类别最好的描述了这篇新闻?扣篮王拉文:精彩暴扣表演!炸\n选项:故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏\n答案:", "target": "电竞", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "你会把这个描述推荐给哪方面的人?银行,社区,电商,支付,经营,卡牌,借贷,驾校,理财,职考,新闻,旅游,交通,魔幻,医疗,影像,动作,工具,体育,小说,运动,相机,工具,快递,教育,股票,菜谱,行车,仙侠,亲子,购物,射击,漫画,小学,同城,成人,求职,电子,艺术,赚钱,约会,经营,兼职,视频,音乐,英语,棋牌,摄影,养生,办公,政务,视频,论坛,彩票,直播,其他,休闲,策略,通讯,买车,违章,地图,民航,电台,语言,搞笑,婚恋,超市,养车,杂志,在线,家政,影视,装修,资讯,社交,餐饮,美颜,挂号,飞行,预定,票务,笔记,买房,外卖,母婴,打车,情侣,日程,租车,博客,百科,绘画,铁路,生活,租房,酒店,保险,问答,收款,竞技,唱歌,技术,减肥,工作,团购,记账,女性,公务,二手,美妆,汽车,行程,免费,教辅,两性,出国,婚庆,民宿快来施放属于你的寒冰魔法吧特殊效果雪花缓缓从上方飘落,手指触碰之处有冰魔法出现爱莎女王脱掉了封印魔法她的手套,在冰雪天地中建造了属于她一个人的辉煌宫殿。安娜中了冰魔法需要真爱之吻才能获救,最终姐妹二人齐心揭穿了异国王子的阴谋拯救了阿伦戴尔。解锁方法随意滑动屏幕一定距离后解锁要是觉得好玩,记得推荐给好朋友哦,,1.新增多张精美冰雪奇缘壁纸2.增加冰雪图钉,锁定当前壁纸功能3.内存,减小电量消耗\n答案:", "target": "休闲益智", "answer_choices": ["银行", "社区", "电商", "支付", "经营", "卡牌", "借贷", "驾校", "理财", "职考", "新闻", "旅游", "交通", "魔幻", "医疗", "影像", "动作", "工具", "体育", "小说", "运动", "相机", "工具", "快递", "教育", "股票", "菜谱", "行车", "仙侠", "亲子", "购物", "射击", "漫画", "小学", "同城", "成人", "求职", "电子", "艺术", "赚钱", "约会", "经营", "兼职", "视频", "音乐", "英语", "棋牌", "摄影", "养生", "办公", "政务", "视频", "论坛", "彩票", "直播", "其他", "休闲", "策略", "通讯", "买车", "违章", "地图", "民航", "电台", "语言", "搞笑", "婚恋", "超市", "养车", "杂志", "在线", "家政", "影视", "装修", "资讯", "社交", "餐饮", "美颜", "挂号", "飞行", "预定", "票务", "笔记", "买房", "外卖", "母婴", "打车", "情侣", "日程", "租车", "博客", "百科", "绘画", "铁路", "生活", "租房", "酒店", "保险", "问答", "收款", "竞技", "唱歌", "技术", "减肥", "工作", "团购", "记账", "女性", "公务", "二手", "美妆", "汽车", "行程", "免费", "教辅", "两性", "出国", "婚庆", "民宿"], "type": "classify"}
{"input": "阅读以下文章,并选择一个合适的成语。文章:\n赵宝刚导演表示,当看到温家宝总理在灾区安慰失去亲人__的孩子时,他再也控制不住自己的感情,不禁潸然泪下。他非常关心灾区的孤儿,目前正计划为孩子们做一些更有意义的事情。当记者问到是否会考虑日后拍一部地震题材的影片时,赵宝刚导演则明确表示自己更愿意为灾区做一些实事,目前正在积极了解灾区儿童的需要,为下一步援助工作做准备。\n 候选成语:忧心忡忡,提心吊胆,后顾之忧,土豪劣绅,叫苦不迭,用武之地,无计可施,明眸皓齿,孤立无援,步步为营。答案是:", "target": "孤立无援", "answer_choices": ["忧心忡忡", "提心吊胆", "后顾之忧", "土豪劣绅", "叫苦不迭", "用武之地", "无计可施", "明眸皓齿", "孤立无援", "步步为营"], "type": "mrc"}
{"input": "这是关于哪方面的新闻?黄埔军校老师有哪些?\n选项:故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏\n答案:", "target": "军事", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "这个是关于哪方面的App应用程序的描述?银行,社区,电商,支付,经营,卡牌,借贷,驾校,理财,职考,新闻,旅游,交通,魔幻,医疗,影像,动作,工具,体育,小说,运动,相机,工具,快递,教育,股票,菜谱,行车,仙侠,亲子,购物,射击,漫画,小学,同城,成人,求职,电子,艺术,赚钱,约会,经营,兼职,视频,音乐,英语,棋牌,摄影,养生,办公,政务,视频,论坛,彩票,直播,其他,休闲,策略,通讯,买车,违章,地图,民航,电台,语言,搞笑,婚恋,超市,养车,杂志,在线,家政,影视,装修,资讯,社交,餐饮,美颜,挂号,飞行,预定,票务,笔记,买房,外卖,母婴,打车,情侣,日程,租车,博客,百科,绘画,铁路,生活,租房,酒店,保险,问答,收款,竞技,唱歌,技术,减肥,工作,团购,记账,女性,公务,二手,美妆,汽车,行程,免费,教辅,两性,出国,婚庆,民宿“魅爱同城美女主动视频陪聊神器,女神绝密私照,一对一视频畅聊,保护你的私密。清纯的萌妹子、火辣的舞女郎,惊艳的时装秀,浪漫的午夜邂逅,伴你告别寂寞和美女主播视频聊天、交友、热舞零距离互动。让你随时随地享受偶遇的激情与惊喜与网红视频网红主播与你在线视频交友,浪漫邂逅。生活动态圈高颜值女神用短视频和照片与你分享生活中的点滴。\n答案:", "target": "约会社交", "answer_choices": ["银行", "社区", "电商", "支付", "经营", "卡牌", "借贷", "驾校", "理财", "职考", "新闻", "旅游", "交通", "魔幻", "医疗", "影像", "动作", "工具", "体育", "小说", "运动", "相机", "工具", "快递", "教育", "股票", "菜谱", "行车", "仙侠", "亲子", "购物", "射击", "漫画", "小学", "同城", "成人", "求职", "电子", "艺术", "赚钱", "约会", "经营", "兼职", "视频", "音乐", "英语", "棋牌", "摄影", "养生", "办公", "政务", "视频", "论坛", "彩票", "直播", "其他", "休闲", "策略", "通讯", "买车", "违章", "地图", "民航", "电台", "语言", "搞笑", "婚恋", "超市", "养车", "杂志", "在线", "家政", "影视", "装修", "资讯", "社交", "餐饮", "美颜", "挂号", "飞行", "预定", "票务", "笔记", "买房", "外卖", "母婴", "打车", "情侣", "日程", "租车", "博客", "百科", "绘画", "铁路", "生活", "租房", "酒店", "保险", "问答", "收款", "竞技", "唱歌", "技术", "减肥", "工作", "团购", "记账", "女性", "公务", "二手", "美妆", "汽车", "行程", "免费", "教辅", "两性", "出国", "婚庆", "民宿"], "type": "classify"}
{"input": "阅读理解:\n有一次,有人问马克·吐温是否记得他第一次是怎样挣到钱的。他想了很久,然后说:“对,我还记得很清楚,那是我在小学读书的时候。那时,小学生们都不尊重自己的老师,而且不爱惜学校的财产,经常弄坏桌椅。所以我们学校就定了一条规则,哪个学生用铅笔或小刀弄坏了桌椅,他就得在全校学生面前挨老师的打,或者交五元罚款。有一天,我弄坏了我的书桌,只好回家对父亲说,我违反了学校的规定,要么罚五元,要么在全校学生面前受到挨打的处分。父亲说当着全校学生的面挨打真是太丢脸了,他答应给我五块钱,让我交给学校。但是在给我这五块钱之前,他把我带到楼上,狠狠地打了我一顿。我想,既然我已经挨过一顿打了,那就干脆当着全校学生的面再挨一顿,这样就可以把那五块钱留下来。我真的这样做了,那就是我第一次挣到的钱。” \n问:父亲为什么给马克·吐温钱? 选项:喜欢他,奖励他,怕丢脸,感谢他\n答案:", "target": "怕丢脸", "type": "mrc", "answer_choices": ["喜欢他", "奖励他", "怕丢脸", "感谢他"]}
{"input": "“全面加强教师特别是农村教师培训,鼓励大学生、师范生到基层、农村任教”根据前面的段落,以下是否是真的“农村教师的培训需要特别重视”?是的,不是,或也许?\n答案:", "target": "是的", "answer_choices": ["是的", "不是", "也许"], "type": "nli"}
{"input": "给定“国民经济保持较快增长”我们应该假定“国民经济一个月内还会保持快速增长”是真的吗?是的,不是,或也许?\n答案:", "target": "也许", "answer_choices": ["是的", "不是", "也许"], "type": "nli"}
{"input": "这个是关于哪方面的App应用程序的描述?银行,社区,电商,支付,经营,卡牌,借贷,驾校,理财,职考,新闻,旅游,交通,魔幻,医疗,影像,动作,工具,体育,小说,运动,相机,工具,快递,教育,股票,菜谱,行车,仙侠,亲子,购物,射击,漫画,小学,同城,成人,求职,电子,艺术,赚钱,约会,经营,兼职,视频,音乐,英语,棋牌,摄影,养生,办公,政务,视频,论坛,彩票,直播,其他,休闲,策略,通讯,买车,违章,地图,民航,电台,语言,搞笑,婚恋,超市,养车,杂志,在线,家政,影视,装修,资讯,社交,餐饮,美颜,挂号,飞行,预定,票务,笔记,买房,外卖,母婴,打车,情侣,日程,租车,博客,百科,绘画,铁路,生活,租房,酒店,保险,问答,收款,竞技,唱歌,技术,减肥,工作,团购,记账,女性,公务,二手,美妆,汽车,行程,免费,教辅,两性,出国,婚庆,民宿移动吧是移动官方面向青海移动用户推出的移动智能终端网上营业厅。新版的移动吧为用户提供方便快捷的账单查询、业务办理、积分查询、通讯录等功能。随时随地尽享青海移动的贴心服务,方便触手可及。查询更丰富直观准确、消费透明充值更优惠专享优惠、充值赠费办理更便捷套餐流量、随时办理好友更亲密相互关注、贴心关怀活动更精彩活动不停、优惠不断更新内容1修复已知Bug;2优化客户端访问速度;3提升活动体验,丰富奖励资源。\n答案:", "target": "工具", "answer_choices": ["银行", "社区", "电商", "支付", "经营", "卡牌", "借贷", "驾校", "理财", "职考", "新闻", "旅游", "交通", "魔幻", "医疗", "影像", "动作", "工具", "体育", "小说", "运动", "相机", "工具", "快递", "教育", "股票", "菜谱", "行车", "仙侠", "亲子", "购物", "射击", "漫画", "小学", "同城", "成人", "求职", "电子", "艺术", "赚钱", "约会", "经营", "兼职", "视频", "音乐", "英语", "棋牌", "摄影", "养生", "办公", "政务", "视频", "论坛", "彩票", "直播", "其他", "休闲", "策略", "通讯", "买车", "违章", "地图", "民航", "电台", "语言", "搞笑", "婚恋", "超市", "养车", "杂志", "在线", "家政", "影视", "装修", "资讯", "社交", "餐饮", "美颜", "挂号", "飞行", "预定", "票务", "笔记", "买房", "外卖", "母婴", "打车", "情侣", "日程", "租车", "博客", "百科", "绘画", "铁路", "生活", "租房", "酒店", "保险", "问答", "收款", "竞技", "唱歌", "技术", "减肥", "工作", "团购", "记账", "女性", "公务", "二手", "美妆", "汽车", "行程", "免费", "教辅", "两性", "出国", "婚庆", "民宿"], "type": "classify"}
{"input": "足三两()是麦当劳推出的一种汉堡包,为继巨无霸后的另一招牌食品。英文名称的意思是「四分之一磅」,因为牛肉重量大约等如四分之一磅(烹调前计),而四分之一磅大约等于三两重,故在香港被称为「足-{}-三两」。在麦当劳于1975年进入香港市场时,Quarter Pounder曾被命名为「大汉-{}-堡」,而Quarter Pounder with Cheese则被命名为「大芝-{}-士汉-{}-堡」,但于1980年代后停售。2000年代初,曾经作为推广产品重新命名为「足-{}-三两」(或写作足-{}-三両),但推广期后便继续停售。直至2007年起,麦当劳在香港推出「Double足-{}-三两」(Double Quarter Pounder,即是双重份量的足-{}-三两)作为MacTonight套餐,于香港时间每晚21:00至翌日凌晨04:00间供应。由于反应理想,香港麦当劳于2009年将其发售时段提早至上午11时开始,并重新引入常规版的「足-{}-三两」作为长期发售的项目。Double足-{}-三两已于2017年初停售,常规版足-{}-三两亦于同年3月9日起停售。事实上,在香港售卖的「足-{}-三两」实际重量只有100克。香港麦当劳的餐牌上足-{}-三两及Double足-{}-三两都会以小字体加上「烹调前」标签,以符合香港海关《商品说明条例》的规定。一个正常的足三两,包括有四分之一磅(113.4克)牛肉(烹调前计)、两块芝麻面包、酸瓜、茄酱及生洋葱,而很多时候足三两也会有一块芝士。\n 从上面的段落中,根据一个合理的答案:麦当劳\n那么问题可能是:", "target": "足三两是哪个品牌的招牌食品之一?", "type": "mrc"}
{"input": "“切实转变工作作风”根据前面的段落,以下是否是真的“这是公文话语”?是的,不是,或也许?\n答案:", "target": "是的", "answer_choices": ["是的", "不是", "也许"], "type": "nli"}
{"input": "“逐步实行中等职业教育免费,今年先从农村家庭经济困难学生和涉农专业做起”记住上面的文字,考虑:“后年就能够全面实现中等职业教育免费”这是总是,绝不,或有时正确的?\n答案:", "target": "有时", "answer_choices": ["总是", "绝不", "有时"], "type": "nli"}
{"input": "阅读下列论文的摘要,然后生成这篇摘要的多个关键词。摘要:通过对泥河湾盆地43条剖面和6个钻孔晚新生代地层和微体古生物(介形类和有孔虫)的调查研究,发现非常丰富的介形类,计26属70余种,有孔虫4属4种,其中介形类自下而上可明显地划分为5个组合带:(1)Potamocyprisplana-Candoniella-Ilyocypris组合带;(2)Leucocythere-Ilyocypris-Candoniella组合带;(3)Leucocythere-Cytherissa-Limnocythere组合带;(4)Ilyocypris-Limnocythereflexa-Limnocytheredubiosa组合带;(5)Limnocytheredubiosa-Limnocytheresancti-Patricii-Ilyocypris组合带.按以上5个介形类组合带的分布,第1组合带及所含地层红崖村组和石匣组的时代为上新世;第2~4组合带及所含地层泥河湾组的时代为早更新世;第5组合带为中-晚更新世,分布于虎头梁组和许家窑组,虎头梁组置中更新世为宜,许家窑组为晚更新世.根据5个介形类组合带和有孔虫的分布及介形类的始现、繁盛、兴衰的演替特征,对泥河湾古湖和盆地的形成经历了上新世的起始,早更新世早期的扩展,中、晚期稳定、发展、湖面最大,中更新世向西部退缩和晚更新世消亡、桑干河水系形成五个发展阶段的演化进行了探讨.。摘要的关键词有这些:\n答案:", "target": "介形类,晚新生代,环境演化,生物地层", "answer_choices": "", "type": "generate"}
{"input": "这个App应用程序的描述会出现在哪个栏目?•只需随身携带手机即可随时了解您步行、跑步和骑车的运动情况。达成健身目标•设定时长或步数目标,并了解自己的进度。•获得根据健身效果提供的运动目标建议。全面掌握健身情况•将第三方设备和应用与Google健身关联后,您就可以在一个地方集中查看您的所有健身数据。随时随地使用•兼容所有AndroidWer设备。•还可以通过浏览器www.google.com/fit和平板电脑使用Google健身。更新内容提升体验,修复部分问题。\n选项:银行,社区,电商,支付,经营,卡牌,借贷,驾校,理财,职考,新闻,旅游,交通,魔幻,医疗,影像,动作,工具,体育,小说,运动,相机,工具,快递,教育,股票,菜谱,行车,仙侠,亲子,购物,射击,漫画,小学,同城,成人,求职,电子,艺术,赚钱,约会,经营,兼职,视频,音乐,英语,棋牌,摄影,养生,办公,政务,视频,论坛,彩票,直播,其他,休闲,策略,通讯,买车,违章,地图,民航,电台,语言,搞笑,婚恋,超市,养车,杂志,在线,家政,影视,装修,资讯,社交,餐饮,美颜,挂号,飞行,预定,票务,笔记,买房,外卖,母婴,打车,情侣,日程,租车,博客,百科,绘画,铁路,生活,租房,酒店,保险,问答,收款,竞技,唱歌,技术,减肥,工作,团购,记账,女性,公务,二手,美妆,汽车,行程,免费,教辅,两性,出国,婚庆,民宿\n答案:", "target": "运动健身", "answer_choices": ["银行", "社区", "电商", "支付", "经营", "卡牌", "借贷", "驾校", "理财", "职考", "新闻", "旅游", "交通", "魔幻", "医疗", "影像", "动作", "工具", "体育", "小说", "运动", "相机", "工具", "快递", "教育", "股票", "菜谱", "行车", "仙侠", "亲子", "购物", "射击", "漫画", "小学", "同城", "成人", "求职", "电子", "艺术", "赚钱", "约会", "经营", "兼职", "视频", "音乐", "英语", "棋牌", "摄影", "养生", "办公", "政务", "视频", "论坛", "彩票", "直播", "其他", "休闲", "策略", "通讯", "买车", "违章", "地图", "民航", "电台", "语言", "搞笑", "婚恋", "超市", "养车", "杂志", "在线", "家政", "影视", "装修", "资讯", "社交", "餐饮", "美颜", "挂号", "飞行", "预定", "票务", "笔记", "买房", "外卖", "母婴", "打车", "情侣", "日程", "租车", "博客", "百科", "绘画", "铁路", "生活", "租房", "酒店", "保险", "问答", "收款", "竞技", "唱歌", "技术", "减肥", "工作", "团购", "记账", "女性", "公务", "二手", "美妆", "汽车", "行程", "免费", "教辅", "两性", "出国", "婚庆", "民宿"], "type": "classify"}
{"input": "这个是关于哪方面的App应用程序的描述?银行,社区,电商,支付,经营,卡牌,借贷,驾校,理财,职考,新闻,旅游,交通,魔幻,医疗,影像,动作,工具,体育,小说,运动,相机,工具,快递,教育,股票,菜谱,行车,仙侠,亲子,购物,射击,漫画,小学,同城,成人,求职,电子,艺术,赚钱,约会,经营,兼职,视频,音乐,英语,棋牌,摄影,养生,办公,政务,视频,论坛,彩票,直播,其他,休闲,策略,通讯,买车,违章,地图,民航,电台,语言,搞笑,婚恋,超市,养车,杂志,在线,家政,影视,装修,资讯,社交,餐饮,美颜,挂号,飞行,预定,票务,笔记,买房,外卖,母婴,打车,情侣,日程,租车,博客,百科,绘画,铁路,生活,租房,酒店,保险,问答,收款,竞技,唱歌,技术,减肥,工作,团购,记账,女性,公务,二手,美妆,汽车,行程,免费,教辅,两性,出国,婚庆,民宿神秘又惊喜的万圣节到啦快来宝宝超市挑选你最爱的南瓜灯和面具吧还可以挑个礼服画个妆,打造超炫的万圣节造型呢和奇奇一起学会在超市购物,成为妈妈购物的好帮手吧丰富商品水果,蔬菜,玩具,零食…各种商品一应俱全模拟真实超市购物的场景,让宝宝体验超市购物的乐趣。根据清单购物你能帮妈妈买到清单上的东西吗对照清单购买需要的东西,让孩子有目的性的逛超市,帮宝宝树立正确的消费观。模拟结账别忘记结账哟~所有商品一共8元,付了10元,该找回多少钱呢,你能帮奇奇算一算吗丰富小游戏鱼缸捞鱼、搭配你喜欢的蛋糕、帮试妆员化上美丽的妆…丰富趣味小游戏,乐趣无穷宝宝巴士以孩子的兴趣启蒙为出发点,从健康、语言、社会、科学、艺术五大领域关注幼儿成长,吸取蒙氏教育精髓,根据幼儿不同年龄段左右脑发育、敏感期特点和学习重点来设计产品,打造“年龄+能力”的多元化产品体系。让孩子在游戏中独立思考,自由学习,享受探索世界的乐趣。宝宝巴士儿童早教pp,众多儿童早教产品的一致选择,孩子从小学宝宝巴士儿歌,贝瓦儿歌,儿歌点点,宝宝树,小伴龙,贝乐虎儿歌,咔哒故事,伴鱼绘本,宝宝手工零食,宝宝时尚设计师等使用者的一致推荐。设计理念宝宝巴士BbyBus,专注启蒙,而不仅仅是教育。我们专注于启发,而不只是学习。我们专注于能力培养,而不只是单一认知。我们专注于寓教于乐,而不是填鸭式教学。宝宝巴士,快乐启蒙全球3.5亿家庭用户的早教首选,您身边的幼儿教育专家搜索宝宝巴士,就可以下载宝宝巴士的所有早教APP了哦~欢迎联系微信宝宝巴士微博@宝宝巴士官网http//www.bbybus.com邮箱cn@bbybus.com更新内容不放过任何可以提升体验的地方,优化细节,让游戏体验更上一层楼贴心的小bug修复,提升稳定性和流畅度,畅玩无压力搜索宝宝巴士,就可以下载宝宝巴士的所有早教APP了哦~欢迎加入宝宝巴士官方Q群288190979,一起为孩子做更多更好的产品。\n答案:", "target": "亲子儿童", "answer_choices": ["银行", "社区", "电商", "支付", "经营", "卡牌", "借贷", "驾校", "理财", "职考", "新闻", "旅游", "交通", "魔幻", "医疗", "影像", "动作", "工具", "体育", "小说", "运动", "相机", "工具", "快递", "教育", "股票", "菜谱", "行车", "仙侠", "亲子", "购物", "射击", "漫画", "小学", "同城", "成人", "求职", "电子", "艺术", "赚钱", "约会", "经营", "兼职", "视频", "音乐", "英语", "棋牌", "摄影", "养生", "办公", "政务", "视频", "论坛", "彩票", "直播", "其他", "休闲", "策略", "通讯", "买车", "违章", "地图", "民航", "电台", "语言", "搞笑", "婚恋", "超市", "养车", "杂志", "在线", "家政", "影视", "装修", "资讯", "社交", "餐饮", "美颜", "挂号", "飞行", "预定", "票务", "笔记", "买房", "外卖", "母婴", "打车", "情侣", "日程", "租车", "博客", "百科", "绘画", "铁路", "生活", "租房", "酒店", "保险", "问答", "收款", "竞技", "唱歌", "技术", "减肥", "工作", "团购", "记账", "女性", "公务", "二手", "美妆", "汽车", "行程", "免费", "教辅", "两性", "出国", "婚庆", "民宿"], "type": "classify"}
{"input": "参考下面的段落,回答下列问题:\n段落:因吊钟的花朵通常在农历新年前后开花,故英文又名为Chinese New Year Flower,意即中国新年花。在清代中叶开始已有吊钟作为年花的习俗,取其「金钟一响,黄金万两」的吉兆,同时吊钟花的花朵都是生长在枝顶上,亦有高中科举之寓意,古时百姓因希望子弟能高中科举,就砍伐吊钟花带回家作为年花。不过近年因人们觉“吊钟”和“吊终”谐音,不吉利,所以较少人以吊钟作为年花。吊钟是一种落叶或半常绿灌木,可高约7米,但常高3米。树皮呈灰黄色,多分枝,小枝呈淡褐色。叶长圆形或倒卵状长圆形,先端渐尖,基部渐狭而成短柄,常密集生于枝顶,互生,革质,表面绿色而背面淡绿色,长5-10厘米,阔2-4厘米,全缘或顶部疏生细齿,叶两面无毛,侧脉6-7对,中脉两面清晰呈羽状伸出,网脉两面清晰,叶短柄长约5-20厘米,灰黄色呈圆柱状无毛。花为伞房花序顶生,花粉红色或红色,常5-8朵,下垂呈钟型,从枝顶覆瓦状排列的红色大苞片内生出,苞片长圆形或长方形,膜质,花梗绿色无毛,长约1.5-2厘米,花萼5裂,披针形先端披纤毛,长约2-4厘米,花冠呈宽钟状,口部5裂,裂片长约1-1.2厘米,裂片钝圆,轻微反卷白色,雄蕊8枚,雌蕊1枚,雌蕊较雄蕊长。果为蒴果,椭圆形无毛,淡黄色,具5梭,长约8-12厘米,果柄直立粗壮,长约3-5厘米。种子有3-5角或翅。喜温暖湿润,日光充足,土壤肥沃含腐殖质及排水良好的土壤。可以使用播种、扦插法及压条法繁殖。\n问题:吊钟花如何进行繁殖?\n答案:", "target": "播种、扦插法及压条法", "type": "mrc"}
{"input": "从医院打完针、开了药回来。母亲就赶到单位去上班了。走前,她把我托付给禾寡妇(候选词),请她(代词)关照我。。上面的句子中,代词“她”指代的是“禾寡妇”吗?选项:是的,不是。答案:", "target": "是的", "type": "anaphora_resolution", "answer_choices": ["是的", "不是"]}
{"input": "《1997年郡尉职权法案》()于1997年生效,是一项英国国会法案,来厘订大不列颠委任的郡尉(Lord Lieutenant)所管辖的地区。根据《1888年地方政府法案》,郡尉是被委派到每一个郡。可是,这个法案所定义的区域混杂了新的行政郡及郡的自治区。实际上,影响很微小,因为只有少数行政郡的边界跟原来的不一样。直到1965年大伦敦及亨廷登-彼得伯勒郡的成立,导致米德尔塞克斯郡尉办公室、伦敦郡郡尉办公室、亨廷登郡郡尉办公室被废除,取而代之就是大伦敦郡尉及亨廷登-彼得伯勒郡尉。1974年,英格兰及威尔斯内的行政郡及郡自治区被废除。一项大型改革也同时推行。所有郡尉辖区都被划分为都会郡和非都会郡。而1973年《苏格兰地方政府法案》则不跟从新的苏格兰地区来厘订郡尉辖区,反而从传统郡中拼合起来。因此,两者结合导致产生出来的郡尉辖区完全不跟从原有的郡。大部分这些郡尉辖区都没有留下来。在1990年代中期的英国地方政府改革中,很多非都会郡都开始重组成为单一管理区。苏格兰及威尔斯的地方政府过渡成为只由单一管理区所组成。这个时候开始草拟这个法案的计划,把郡尉辖区从地方政府再次分出来。虽然法案没有使用这个计划,但这些地方成了英格兰的名誉郡。\n 参考上述上下文,改革推行后,所有郡尉辖区被划分为什么?\n答案:", "target": "都会郡和非都会郡", "type": "mrc"}
{"input": "香港2004年继去年七一游行后再次经历了巨大政治争议,4月全国人民代表大会常务委员会第二次行使权力解释基本法,并否决了0708年双普选。5月,商业电台多名著名节目主持人指受到压力相继暂停节目,发生了「商台名嘴封咪事件」。7月1日,仍有数以十万计市民参与七一游行表达争取民主诉求。9月,第三届立法会选举刷新了历届投票纪录,有178万多人投票(投票率55.64%)。经济方面,去年发生沙士事件后情况逐渐改善,失业率下跌至2004年第四季的6.5%,是近三年以来的低位,年内本地生产总值增长8.1%,是自1987年以来的第二快增长,历时68个月的通缩终于结束,经济复苏主要受惠于东亚、欧美国等主要市场的强劲需求,以及中国内地对外贸易畅旺和内部需求殷切所带动。然而去年沙士期间,带来经济下滑以及增加开支,政府账目录得赤字401亿。下列节庆,如无注明,均是香港的公众假期,同时亦是法定假日(俗称劳工假期)。有 # 号者,不是公众假期或法定假日(除非适逢星期日或其它假期),但在商业炒作下,市面上有一定节庆气氛,传媒亦对其活动有所报导。详情可参看香港节日与公众假期。\n 从上面的段落中,根据一个合理的答案:受惠于东亚、欧美国等主要市场的强劲需求,以及中国内地对外贸易畅旺和内部需求殷切所带动。\n那么问题可能是:", "target": "香港2004年经济复苏的原因是什么?", "type": "mrc"}
{"input": "这是关于哪方面的新闻: 故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏?首次承认落后,美媒披露中国高超音速导弹技术领先美国\n答案:", "target": "军事", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "这是关于哪方面的新闻: 故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏?未来5年,教师会成为高收入人群吗?\n答案:", "target": "国际", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "阅读下面短文,从短文后给出的候选项中选出最佳选项。\n 新浪体育讯叠泉自开业以来,以其球场精良的设计、球会周到的服务,在业界的影响力不断提高,吸引了大批高尔夫爱好者慕名来到球会,这其中包括大家__的各界知名人士,政界、财经、实业、演艺界等有社会公众影响力的人物#idiom593805#。然而他们却拥有着很多共同点:他们都是社会各界的领袖精英;他们都在各自的领域颇有建树;他们都在接触叠泉后被其美丽而又富有挑战的场地所折服,#idiom593806#。 \n 候选项:神龙见首,各式各样,耳熟能详,不一而足,一应俱全,流连忘反,不胜枚举,沾沾自喜,一无所有,衣食住行。最佳选项是:", "target": "耳熟能详", "answer_choices": ["神龙见首", "各式各样", "耳熟能详", "不一而足", "一应俱全", "流连忘反", "不胜枚举", "沾沾自喜", "一无所有", "衣食住行"], "type": "mrc"}
{"input": "唐音是日本汉字音(音读)的一类。广义的「唐音」(唐宋音)指镰仓时代以后直至近代传入日本的汉字音,也就是明清时期的南方标准语「南京官话」。包含室町时代传入的「宋音」与狭义的「唐音」,即江户时代(明清)传入的汉字音。「唐音」的「唐」与「吴音」的「吴」和「汉音」的「汉」一样,并非指朝代,而是对中国的泛称。本文以论述狭义的唐音为主。江户时代传入的「唐音」与之前的「宋音」一样,主要限于佛典诵读及学问研究等,对一般用语的影响很小,仅限于特定的词语。唐音内部尚有不同的系统。就来源而言,大体分为以下三系。第一是隐元隆琦(福州府福清县人)于承应三年(1654)渡日后建立的黄檗宗所传承的用于诵读清规的明代音。第二是延宝五年(1677)渡日的曹洞宗心越派开祖心越兴俦(杭州人)所传的清规和琴谱(明乐)的诵读音。第三是江户时代的汉语学者(1674-1728)及韵镜学者文雄(1700-1763)等研究者通过长崎的通事(翻译官)等所学的中国音。有坂秀世氏将此三类分别称为黄檗唐音、心越系唐音和译官系唐音。这些音皆主要源于明末清初的南京官话音。相比于镰仓时代的宋音反映出更新的音韵变化。唐音由于母胎音的关系,带有明显的类似于现代官话和吴语发音的特色。甚至宕摄入声字也有的以エツ表示,如 阁ケツ。反映这些韵的韵腹为中母音。唐音的例词如下列举(此处一并列举可能为宋音的词)。椅子(イス) 蒲団(フトン) 行灯(アンドン) 行脚(アンギャ) 馅(アン)明(ミン) 清(シン) 普请(フシン) 白汤(パイタン) 石灰(シックイ) 馒头(マンジュウ)\n 从上面的段落中产生一个问题:", "target": "「唐音」的「唐」与「吴音」的「吴」和「汉音」的「汉」都指什么", "type": "mrc"}
{"input": "“还还没有,没有回来呢.”仅使用以上描述和你对世界所了解的,“有人还没有回来”是正确,错误,或未知?\n答案:", "target": "正确", "answer_choices": ["正确", "错误", "未知"], "type": "nli"}
{"input": "这些关键词“通用航空,导航系统,航图管理,航空器”代表了这篇论文的摘要:“为满足通用航空器对结构简单、价格低廉的导航系统的需求,提出一种机载便携式导航系统方案。系统以航路图作为背景,通过标定技术实现航图像素坐标与经纬度坐标的配准,并通过对航图的分割与四叉树管理,降低了对设备内存的需求,随着航空器位置更新,系统通过平移、旋转航图实现对航空器的导航。仿真实验结果表明,航空器在航图上定位精确,系统对于航图的平移、旋转响应准确,便携式导航系统可以满足通用航空器导航的需求,对通航飞行安全提供了一定的技术支持。”。这是正确的吗?\n选项:是的,不是\n答案:", "target": "不是", "answer_choices": ["是的", "不是"], "type": "classify"}
{"input": "根据短文内容,选出缺少的成语填在下划线处。\n 梅柏肯__。“你未经我的许可就擅自结婚,对我而言,要废除这个婚姻#idiom588293#。”他的眼睛闪着微光。“事实上,我相信你会发现登记你们结婚的记录员已经神秘失踪,而替你们主持婚礼的牧师已搬到法国。你想要证明自己结了婚恐怕是难上加难。” \n 候选成语:借花献佛,嗤之以鼻,易如反掌,投桃报李,求之不得,大失所望,虚位以待,无人之境,喜出望外,落井下石。 正确答案是:", "target": "嗤之以鼻", "answer_choices": ["借花献佛", "嗤之以鼻", "易如反掌", "投桃报李", "求之不得", "大失所望", "虚位以待", "无人之境", "喜出望外", "落井下石"], "type": "mrc"}
{"input": "这是关于哪方面的新闻?买家付了款却没有购房资格,卖家能解除房屋买卖合同吗?\n选项:故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏\n答案:", "target": "房产", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "阅读短文:\n 方宏进在与律师商量后决定于今日将__于天下。方宏进昨日接受了个别媒体的电话采访,并不避讳自己现在很麻烦。据悉,方宏进身上牵扯的官司不止此次今麦郎这一起,之前还和多家企业发生矛盾,精通金融知识的他一直希望在商业场上大展拳脚,加之其之前央视名嘴的身份,他一直坚信自己能成功。不过,成立了北京澳卫时代广告公司(简称澳卫)的他生意方面却不顺利,记者昨日得悉,该公司已被吊销了营业执照,公司原址也已易主。记者从方宏进一位朋友那边了解到,方宏进经常用酒精麻痹自己,日前接受记者电话采访,还用一起喝酒来“打掩护”,拒绝回应实质性内容。 \n 从候选成语“扫地出门,一网打尽,顺藤摸瓜,狗血喷头,真相大白,走投无路,逍遥法外,治病救人,东窗事发,名正言顺”中选出最适合填在下划线处的成语。正确答案是:", "target": "真相大白", "answer_choices": ["扫地出门", "一网打尽", "顺藤摸瓜", "狗血喷头", "真相大白", "走投无路", "逍遥法外", "治病救人", "东窗事发", "名正言顺"], "type": "mrc"}
{"input": "“也是作践你自己,好歹我总是你的女儿”我们这样说有道理吗“我是你的女儿改变不了”?是的,不是,或也许?\n答案:", "target": "是的", "answer_choices": ["是的", "不是", "也许"], "type": "nli"}
{"input": "阅读以下文章,并选择一个合适的成语。文章:\n新浪娱乐讯一向在银幕上保持文艺、内敛气质的黄璐,近日在最新写真中彰显出自身阳光、青春的一面,粉色系运动装扮搭配__的绿茵场背景,如夏日般朝气蓬勃的年轻气息扑面而来,吸引众人目光。\n 候选成语:郁郁葱葱,万家灯火,高楼大厦,车水马龙,欣欣向荣,浮光掠影,东西南北,乔装打扮,下里巴人,四通八达。答案是:", "target": "郁郁葱葱", "answer_choices": ["郁郁葱葱", "万家灯火", "高楼大厦", "车水马龙", "欣欣向荣", "浮光掠影", "东西南北", "乔装打扮", "下里巴人", "四通八达"], "type": "mrc"}
{"input": "阅读以下对话并回答问题。\n女:今天已经三月十五号了,那个调研报告什么时候可以完成?男:下个月中旬应该可以。问题:男的打算什么时候完成报告?选项:3月初,3月15号,4月中旬,4月底\n答案:", "target": "4月中旬", "answer_choices": ["3月初", "3月15号", "4月中旬", "4月底"], "type": "mrc"}
{"input": "阅读下列论文摘要,然后判断下面的这些关键词是否都是论文摘要合适的关键词?\n摘要:集成多跳中继技术的WiMAXMesh网络中,当发送功率和信道数目一定时,用户接入链路的传输速率直接取决于用户到中继的距离.在满足用户到中继距离要求的条件下,研究最少中继部署问题具有保证网络性能、降低组网成本的意义.文中将该问题转化为最少团划分问题,基于用户邻居信息提出启发式算法MAXDCP,基于用户位置信息提出启发式算法GEOCP.模拟结果表明:与该问题的最新算法MIS相比,在相同时间复杂度下,MAXDCP部署中继的个数平均减少23.8%,GEOCP平均减少35%;与已有PTAS算法HS相比,GEOCP部署中继个数平均减少18.5%,且时间复杂度更低.MAXDCP和GEOCP很好地保证了网络性能、降低了组网成本.\n关键词:问题,信息,中继,组网。答案是:\n选项:是的,不是\n答案:", "target": "不是", "answer_choices": ["是的", "不是"], "type": "classify"}
{"input": "哪个类别最好的描述了这篇新闻?芦淞区档案史志局指导档案规范化管理工作\n选项:故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏\n答案:", "target": "财经", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "根据短文内容,选出缺少的成语填在下划线处。\n 慢慢地,“朝圣”变成对亚洲无法满足的好奇,而不是倒拨世纪之钟的时针,寻觅历史的源头。于是,他想到哪儿就到哪儿,不管亚历山大大帝是不是到过那个地方。他骑马翻过东土耳其的__,看见积雪覆盖着山坡,从撒哈拉大沙漠#idiom598242#吹来的黄沙,又将那山坡变成粉红色。现在,让他#idiom598243#的是,大自然神奇的力量和人类如何面对大自然、改造大自然。 \n 候选成语:崇山峻岭,冰天雪地,肃然起敬,一望无际,翻山越岭,各抒己见,一马平川,玄之又玄,开诚布公,成年累月。 正确答案是:", "target": "崇山峻岭", "answer_choices": ["崇山峻岭", "冰天雪地", "肃然起敬", "一望无际", "翻山越岭", "各抒己见", "一马平川", "玄之又玄", "开诚布公", "成年累月"], "type": "mrc"}
{"input": "摘要:为了解汉族民间童帽所隐含的民俗审美及民俗文化,以江南大学民间服饰传习馆藏品为研究对象,通过实物归纳法对其装饰用色、图案、配件,以及装饰元素的布局特点、装饰纹样造型特点进行分析研究.结果表明:近代汉族民间童帽装饰元素丰富,充满童趣,形成了自己的装饰规范,较其他类服饰更具特色;童帽装饰元素与民间生活密切相关,并非偶然形成.其丰富的文化内涵为研究与儿童相关的民俗风俗提供参考,为儿童服饰设计提供了丰富的素材.\n 以下的关键词都是这篇摘要合适的关键词吗?关键词:童帽,图案,装饰。答案是:\n选项:是的,不是\n答案:", "target": "不是", "answer_choices": ["是的", "不是"], "type": "classify"}
{"input": "给定“王琦瑶嘴里说抱歉的话,心里却想:严师母的意思其实是说她不识抬举”保证是真实的吗“王琦瑶在心里反思以后该怎么做的更好”?是的,不是,或也许?\n答案:", "target": "不是", "answer_choices": ["是的", "不是", "也许"], "type": "nli"}
{"input": "给定“当然了,当然我这身材等于男模横着放,所以我不走秀,我坐秀”保证是真实的吗““我”喜欢坐着不爱动”?是的,不是,或也许?\n答案:", "target": "也许", "answer_choices": ["是的", "不是", "也许"], "type": "nli"}
{"input": "哪个类别最好的描述了这篇新闻?魅力乡村|忻州岢岚宋家沟村新貌\n选项:故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏\n答案:", "target": "旅游", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "\n段落:日本传统歌舞剧场有一条奇特的规定:观众即使看到入迷处,也只能心领神会,而不准喝彩,否则会被他人侧目而视。而台下寥寥无几的喝彩者则是剧院特邀的职业喝彩师,受过专门的喝彩训练,熟谙什么时候用什么方式喝彩,以便同台上的演员上下呼应,使演出更加趣味盎然。这些职业喝彩师多为男性,社会地位颇高,著名的喝彩大师甚至同演员齐名。他们可以自由出入剧场,坐特等包厢,有的剧团和剧院还特邀大名鼎鼎的喝彩大师光临以抬高身价。自然,喝彩大师领取的报酬也很高。不过,现在日本的喝彩师已越来越少,因而培养职业喝彩师已成为日本传统歌舞的当务之急。 \n问:目前急需解决的是什么? 选项:邀请喝彩大师,抬高喝彩大师身份,喝彩大师能自由出入,尽快培养职业喝彩师 \n答案:", "target": "尽快培养职业喝彩师", "type": "mrc", "answer_choices": ["邀请喝彩大师", "抬高喝彩大师身份", "喝彩大师能自由出入", "尽快培养职业喝彩师"]}
{"input": "摘要:针对采用一次二阶矩法计算复杂、高度非线性功能函数的可靠指标时,求解功能函数对随机变量的偏导数极其困难,并且偏导数形式非常复杂等问题,提出用响应面函数代替原功能函数的方法,使其求导过程方便,并且使偏导数形式转化为随机变量的线性表达式,便于程序化求解.然后以计算三维Hoek-Brown强度准则的可靠度为例,确认响应面法在复杂、高度非线性功能函数可靠度计算中的可行性,并与变量代换法和复合函数求导法则的计算结果进行比较,说明利用响应面法计算的结果具有较高的精度.最后,用响应面法分析强度准则参数分布类型和岩体参数之间的相关性对三维Hoek-Brown准则可靠度的影响规律.研究结果表明:该方法具有较高精度;强度准则参数分布类型对可靠指标的敏感性较弱;岩体参数的负相关系数与可靠指标线性相关,对可靠指标的影响不大.\n 以下的关键词都是这篇摘要合适的关键词吗?关键词:Hoek-Brown准则,功能,响应面法。答案是:\n选项:是的,不是\n答案:", "target": "不是", "answer_choices": ["是的", "不是"], "type": "classify"}
{"input": "以下两句话的意思相同的吗?“怎么我的蚂蚁借呗不能用了”,“怎么我不能使用蚂蚁借呗”。选项:是的,不是。答案:", "target": "是的", "answer_choices": ["是的", "不是"], "type": "classify"}
{"input": "“现在婴儿的健康状况仍很严重”记住上面的文字,考虑:“婴儿已经完全康复了。”这是总是,绝不,或有时正确的?\n答案:", "target": "绝不", "answer_choices": ["总是", "绝不", "有时"], "type": "nli"}
{"input": "这是一个成语填空任务。上文是:早上锻炼还可以提高你一天的。 \n下文是:,所以调整一下作息时间,早起30分钟,锻炼一下吧。导语:如果你2011年的计划之一是减肥,希望你在1号的时候没有满脑子想着“从明天开始”减肥没有捷径,但是可以有“jumpstart”,就是一个见效快的开始。那些“常年”减肥的女性朋友们,都应当知道减肥最难得是后期的坚持和养成一个健康的生活方式。\n候选的成语:安然无恙,误打误撞,起死回生,新陈代谢,故态复萌,自食其力,死里逃生,因祸得福,返老还童,开山祖师。请问:我们应该填写哪个成语?\n答案:", "target": "新陈代谢", "answer_choices": ["安然无恙", "误打误撞", "起死回生", "新陈代谢", "故态复萌", "自食其力", "死里逃生", "因祸得福", "返老还童", "开山祖师"], "type": "mrc"}
{"input": "阅读以下段落:\n我想找个演外国旧片的影院,走了两家都满座。走到一家剧场,有人迎上来问我要不要退票。我只肯出一张电影票的价,那人踌躇一下,索性把票子白送给我,我进剧场时不禁有些怀疑。剧场里只有稀稀拉拉儿个观众,台上一个古装少女在跳着徐缓但十分舒展的中国古典舞。水袖在淡蓝的光中拖来曳去,腰肢婀娜地扭动,筝和琵琶流水般地倾泻,天幕一片辽远清丽的冷调子。曲终舞罢,灯光暗下来。尽管我很入迷,也没鼓掌。舞台再次亮起来时,这个姑娘穿得很少地跳出来。跳了一会儿我才明白,她跳的是一个神话中的女英雄。在共工那个倒霉蛋头触不周山、造成__的严重后果后,这个女人像瓦匠一样把天重新砌好,使我们人类得以继续繁衍。据说,也是这个女人,同她的同胞交尾产卵,提供了第一批人种。值得欣慰的是编导没让这个女孩子裹上一层蛇皮,否则,她就不能向我们展现她那双极富表现力、#idiom598598#的腿。最后,我还是觉得扫兴。我以为不该让一个女孩子向成年人表现雄壮、慈悲,即使她是好心眼。我对这个女孩子印象深刻,因为她表现#idiom598599#后接踵而来的死亡很传神,简直可以说死得#idiom598600#。\n其中下划线处需要填写成语,有以下候选项:生气勃勃,洋洋得意,明媒正娶,怨气冲天,内忧外患,阒其无人,功成名遂,祸从天降,祸不单行,天塌地陷。下划线处合适的成语是:", "target": "天塌地陷", "answer_choices": ["生气勃勃", "洋洋得意", "明媒正娶", "怨气冲天", "内忧外患", "阒其无人", "功成名遂", "祸从天降", "祸不单行", "天塌地陷"], "type": "mrc"}
{"input": "这个是关于哪方面的App应用程序的描述?银行,社区,电商,支付,经营,卡牌,借贷,驾校,理财,职考,新闻,旅游,交通,魔幻,医疗,影像,动作,工具,体育,小说,运动,相机,工具,快递,教育,股票,菜谱,行车,仙侠,亲子,购物,射击,漫画,小学,同城,成人,求职,电子,艺术,赚钱,约会,经营,兼职,视频,音乐,英语,棋牌,摄影,养生,办公,政务,视频,论坛,彩票,直播,其他,休闲,策略,通讯,买车,违章,地图,民航,电台,语言,搞笑,婚恋,超市,养车,杂志,在线,家政,影视,装修,资讯,社交,餐饮,美颜,挂号,飞行,预定,票务,笔记,买房,外卖,母婴,打车,情侣,日程,租车,博客,百科,绘画,铁路,生活,租房,酒店,保险,问答,收款,竞技,唱歌,技术,减肥,工作,团购,记账,女性,公务,二手,美妆,汽车,行程,免费,教辅,两性,出国,婚庆,民宿界面简洁清晰,没有多余的装饰,方便您更加直观的查阅分析各彩种信息动态。主推时下热门彩种的开奖信息、历史开奖、走势分析、预测选号、彩种排行等。是您分析走势的必备工具。,,提升体验,修复部分问题。\n答案:", "target": "彩票", "answer_choices": ["银行", "社区", "电商", "支付", "经营", "卡牌", "借贷", "驾校", "理财", "职考", "新闻", "旅游", "交通", "魔幻", "医疗", "影像", "动作", "工具", "体育", "小说", "运动", "相机", "工具", "快递", "教育", "股票", "菜谱", "行车", "仙侠", "亲子", "购物", "射击", "漫画", "小学", "同城", "成人", "求职", "电子", "艺术", "赚钱", "约会", "经营", "兼职", "视频", "音乐", "英语", "棋牌", "摄影", "养生", "办公", "政务", "视频", "论坛", "彩票", "直播", "其他", "休闲", "策略", "通讯", "买车", "违章", "地图", "民航", "电台", "语言", "搞笑", "婚恋", "超市", "养车", "杂志", "在线", "家政", "影视", "装修", "资讯", "社交", "餐饮", "美颜", "挂号", "飞行", "预定", "票务", "笔记", "买房", "外卖", "母婴", "打车", "情侣", "日程", "租车", "博客", "百科", "绘画", "铁路", "生活", "租房", "酒店", "保险", "问答", "收款", "竞技", "唱歌", "技术", "减肥", "工作", "团购", "记账", "女性", "公务", "二手", "美妆", "汽车", "行程", "免费", "教辅", "两性", "出国", "婚庆", "民宿"], "type": "classify"}
{"input": "带着问题来阅读文章并回答问题:\n问:教授想说明什么道理? \n选项:装满杯子可以有多种方式,如何去解决生活中的问题,人生必须要实现一些目标,别让烦恼和忧郁占据生活 \n段落:一位教授在一个空杯子里装满大石块,又倒进一些小石子,并轻轻摇动杯子,让小石子滚进石块之间的空隙;然后教授拿出一些沙子倒进杯子,摇动杯子,把小石子间的空隙都填满;最后他又往杯子里倒水,把杯子所有的空间都填满。做完这些,教授对学生们说:“现在,我想让大家把这个杯子理解为生活。里面的大石块代表生命中最珍贵的东西,比如说家庭、伴侣、健康、孩子等等,所有这些对我们来说都极为重要,一旦失去将永远无法弥补;小石子代表生命中较为重要的东西,如工作、房子、车子等等;沙子代表生命中的日常小事;水代表烦恼、忧郁。请记住,如果我们先把水和沙子装进杯子,那就没有空间去装大石块和小石子了。”\n答案:", "target": "别让烦恼和忧郁占据生活", "type": "mrc", "answer_choices": ["装满杯子可以有多种方式", "如何去解决生活中的问题", "人生必须要实现一些目标", "别让烦恼和忧郁占据生活"]}
{"input": "对话:男:欢迎你,刘经理,好久不见了。女:是啊,如果不是因为工作,我们还真是难得见一次面。男:这次我要好好儿请你吃个饭,上次你走得太急了。女:那就太谢谢你了。问题:他们可能是什么关系?选项:夫妻,朋友,师生\n答案:", "target": "朋友", "answer_choices": ["夫妻", "朋友", "师生"], "type": "mrc"}
{"input": "阅读文章:\n“没关系,”他尽量__地说,“我也迟到了。杰克和米莉。布坎南打架了,我正要走的时候他来到我家。我给他吃了一杯酒,打发他上床了。”他为她倒了一杯酒,可她没有接杯子。“他就是你办公室的那位吗?我是说,在卡尔参议员办公室工作的那位吗?”她虽然没见过他的同事,但是他们的\n其中下划线的地方需要填写成语,有以下候选的成语:心平气和,以理服人,认祖归宗,开诚布公,依然故我,生吞活剥,和颜悦色,将心比心,不动声色,一本正经。正确的成语是:", "target": "心平气和", "answer_choices": ["心平气和", "以理服人", "认祖归宗", "开诚布公", "依然故我", "生吞活剥", "和颜悦色", "将心比心", "不动声色", "一本正经"], "type": "mrc"}
{"input": "这是关于哪方面的新闻?有哪些娱乐圈里面的明星追星?\n选项:故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏\n答案:", "target": "娱乐", "answer_choices": ["故事", "文化", "娱乐", "体育", "财经", "房产", "汽车", "教育", "科技", "军事", "旅游", "国际", "股票", "农业", "游戏"], "type": "classify"}
{"input": "摘要:提应用常规观测资料、NCEP再分析资料,对比分析了山东两次春季黄淮气旋暴雨落区异同点。发现春季影响山东的黄淮气旋暴雨区集中出现在气旋中心北侧的偏东风中,且主要位于东北气流中。暴雨区偏北的程度,与影响系统的后倾程度及我国东北地区是否存在高压有关。当系统明显后倾时,锋面坡度小,暖湿气流沿锋面向北爬升的更远,暴雨区更偏北;当我国东北地区存在高压时,其南侧东北气流经渤海侵入850hPa低涡后部,与低涡前东南气流在风向上渐近辐合,在低涡北侧产生辐合中心,从而产生暴雨区。此外,地面东北风形成的冷垫,有利于南方暖湿气流向北爬升。实际暴雨落区预报中,需综合分析系统的空间结构、周围系统的影响及温度场的配置等。 \n关键词:hPa低涡,5,暴雨落区,系统空间结构。请问:上面的关键词都是这篇摘要合适的关键词吗?\n选项:是的,不是\n答案:", "target": "是的", "answer_choices": ["是的", "不是"], "type": "classify"}
### Training a model with the pCLUE dataset
* Train, predict, and evaluate on Colab using the pCLUE dataset; implemented in PyTorch
[](https://colab.research.google.com/drive/1QIQDWAACkV7-iRrkrk18XrRjEekMhOtv?usp=sharing)
|
squaredev/translated_alpaca_tasks_gr | ---
license: apache-2.0
task_categories:
- question-answering
- conversational
- text-generation
language:
- el
pretty_name: Translated alpaca tasks in Greek
size_categories:
- 10K<n<100K
---
Alpaca tasks dataset translated into Greek using GPT-3.5.
Translation was done in chunks of 10K. |
LimYeri/LeetCode_with_Solutions | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question_content
dtype: string
- name: title_slug
dtype: string
- name: tag
dtype: string
- name: level
dtype: string
- name: question_hints
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 119458837
num_examples: 34903
download_size: 40227362
dataset_size: 119458837
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
language:
- en
tags:
- code
pretty_name: LeetCode with Solutions
size_categories:
- 10K<n<100K
---
Source datasets:
- [LimYeri/LeetCode_YT_CC_CoT_Summary](https://huggingface.co/datasets/LimYeri/LeetCode_YT_CC_CoT_Summary)
- [kreimben/leetcode_user_submissions](https://huggingface.co/datasets/kreimben/leetcode_user_submissions)
- [greengerong/leetcode](https://huggingface.co/datasets/greengerong/leetcode) |
SinclairWang/test | ---
license: cc-by-nc-4.0
---
|
karandomguy/news-article-to-bollywood-song | ---
license: mit
---
|
yzhuang/autotree_snnxor_n30_l1_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 154520000
num_examples: 10000
- name: validation
num_bytes: 154520000
num_examples: 10000
- name: test
num_bytes: 154520000
num_examples: 10000
download_size: 185693440
dataset_size: 463560000
---
# Dataset Card for "autotree_snnxor_n30_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/atis_nlpaug_20pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 417249
num_examples: 4455
download_size: 177612
dataset_size: 417249
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Icchan/LohHeHelo | ---
license: mit
---
|
PL-MTEB/ppc-pairclassification | ---
license: cc-by-nc-sa-4.0
---
|
open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.2 | ---
pretty_name: Evaluation run of davidkim205/Rhea-72b-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [davidkim205/Rhea-72b-v0.2](https://huggingface.co/davidkim205/Rhea-72b-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T07:54:02.522577](https://huggingface.co/datasets/open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.2/blob/main/results_2024-03-24T07-54-02.522577.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7806390565791278,\n\
\ \"acc_stderr\": 0.027711233842223508,\n \"acc_norm\": 0.7818850286033018,\n\
\ \"acc_norm_stderr\": 0.02826655133417084,\n \"mc1\": 0.6487148102815178,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.7450059289802828,\n\
\ \"mc2_stderr\": 0.014438907380750043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7568259385665529,\n \"acc_stderr\": 0.012536554144587087,\n\
\ \"acc_norm\": 0.7755972696245734,\n \"acc_norm_stderr\": 0.01219140493860383\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7667795259908384,\n\
\ \"acc_stderr\": 0.00422017156927332,\n \"acc_norm\": 0.9083847839075881,\n\
\ \"acc_norm_stderr\": 0.0028789243105734504\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n\
\ \"acc_stderr\": 0.038532548365520045,\n \"acc_norm\": 0.725925925925926,\n\
\ \"acc_norm_stderr\": 0.038532548365520045\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8452830188679246,\n \"acc_stderr\": 0.022257075558791282,\n\
\ \"acc_norm\": 0.8452830188679246,\n \"acc_norm_stderr\": 0.022257075558791282\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n\
\ \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \
\ \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7745664739884393,\n\
\ \"acc_stderr\": 0.031862098516411454,\n \"acc_norm\": 0.7745664739884393,\n\
\ \"acc_norm_stderr\": 0.031862098516411454\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n\
\ \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8212765957446808,\n \"acc_stderr\": 0.02504537327205098,\n\
\ \"acc_norm\": 0.8212765957446808,\n \"acc_norm_stderr\": 0.02504537327205098\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.02306818884826113,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02306818884826113\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8806451612903226,\n \"acc_stderr\": 0.018443411325315413,\n \"\
acc_norm\": 0.8806451612903226,\n \"acc_norm_stderr\": 0.018443411325315413\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"\
acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781675,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781675\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723332,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723332\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n\
\ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588768,\n\
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588768\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4925925925925926,\n \"acc_stderr\": 0.030482192395191506,\n \
\ \"acc_norm\": 0.4925925925925926,\n \"acc_norm_stderr\": 0.030482192395191506\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.022448264476832576,\n\
\ \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.022448264476832576\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5894039735099338,\n \"acc_stderr\": 0.04016689594849929,\n \"\
acc_norm\": 0.5894039735099338,\n \"acc_norm_stderr\": 0.04016689594849929\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9339449541284404,\n \"acc_stderr\": 0.01064913148785894,\n \"\
acc_norm\": 0.9339449541284404,\n \"acc_norm_stderr\": 0.01064913148785894\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7083333333333334,\n \"acc_stderr\": 0.03099866630456053,\n \"\
acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.03099866630456053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.0315452167200547,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.0315452167200547\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517962,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517962\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783656,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783656\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n\
\ \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.6607142857142857,\n\
\ \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9208173690932312,\n\
\ \"acc_stderr\": 0.009656024044324226,\n \"acc_norm\": 0.9208173690932312,\n\
\ \"acc_norm_stderr\": 0.009656024044324226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8439306358381503,\n \"acc_stderr\": 0.019539014685374036,\n\
\ \"acc_norm\": 0.8439306358381503,\n \"acc_norm_stderr\": 0.019539014685374036\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8011173184357542,\n\
\ \"acc_stderr\": 0.013349892983092517,\n \"acc_norm\": 0.8011173184357542,\n\
\ \"acc_norm_stderr\": 0.013349892983092517\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.01989943546353996,\n\
\ \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.01989943546353996\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8585209003215434,\n\
\ \"acc_stderr\": 0.019794326658090555,\n \"acc_norm\": 0.8585209003215434,\n\
\ \"acc_norm_stderr\": 0.019794326658090555\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790906,\n\
\ \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790906\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6560283687943262,\n \"acc_stderr\": 0.028338017428611334,\n \
\ \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.028338017428611334\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6258148631029987,\n\
\ \"acc_stderr\": 0.012359335618172063,\n \"acc_norm\": 0.6258148631029987,\n\
\ \"acc_norm_stderr\": 0.012359335618172063\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8455882352941176,\n \"acc_stderr\": 0.021950024722922026,\n\
\ \"acc_norm\": 0.8455882352941176,\n \"acc_norm_stderr\": 0.021950024722922026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8333333333333334,\n \"acc_stderr\": 0.015076937921915372,\n \
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.015076937921915372\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759033,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759033\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6487148102815178,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.7450059289802828,\n\
\ \"mc2_stderr\": 0.014438907380750043\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8634569850039463,\n \"acc_stderr\": 0.009650242900291603\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7846853677028052,\n \
\ \"acc_stderr\": 0.011322096294579658\n }\n}\n```"
repo_url: https://huggingface.co/davidkim205/Rhea-72b-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|arc:challenge|25_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|gsm8k|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hellaswag|10_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T07-54-02.522577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T07-54-02.522577.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- '**/details_harness|winogrande|5_2024-03-24T07-54-02.522577.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T07-54-02.522577.parquet'
- config_name: results
data_files:
- split: 2024_03_24T07_54_02.522577
path:
- results_2024-03-24T07-54-02.522577.parquet
- split: latest
path:
- results_2024-03-24T07-54-02.522577.parquet
---
# Dataset Card for Evaluation run of davidkim205/Rhea-72b-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davidkim205/Rhea-72b-v0.2](https://huggingface.co/davidkim205/Rhea-72b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.2",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-24T07:54:02.522577](https://huggingface.co/datasets/open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.2/blob/main/results_2024-03-24T07-54-02.522577.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7806390565791278,
"acc_stderr": 0.027711233842223508,
"acc_norm": 0.7818850286033018,
"acc_norm_stderr": 0.02826655133417084,
"mc1": 0.6487148102815178,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.7450059289802828,
"mc2_stderr": 0.014438907380750043
},
"harness|arc:challenge|25": {
"acc": 0.7568259385665529,
"acc_stderr": 0.012536554144587087,
"acc_norm": 0.7755972696245734,
"acc_norm_stderr": 0.01219140493860383
},
"harness|hellaswag|10": {
"acc": 0.7667795259908384,
"acc_stderr": 0.00422017156927332,
"acc_norm": 0.9083847839075881,
"acc_norm_stderr": 0.0028789243105734504
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.038532548365520045,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.038532548365520045
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8452830188679246,
"acc_stderr": 0.022257075558791282,
"acc_norm": 0.8452830188679246,
"acc_norm_stderr": 0.022257075558791282
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.031862098516411454,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.031862098516411454
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8212765957446808,
"acc_stderr": 0.02504537327205098,
"acc_norm": 0.8212765957446808,
"acc_norm_stderr": 0.02504537327205098
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02306818884826113,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02306818884826113
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8806451612903226,
"acc_stderr": 0.018443411325315413,
"acc_norm": 0.8806451612903226,
"acc_norm_stderr": 0.018443411325315413
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781675,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781675
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723332,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588768,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588768
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4925925925925926,
"acc_stderr": 0.030482192395191506,
"acc_norm": 0.4925925925925926,
"acc_norm_stderr": 0.030482192395191506
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.022448264476832576,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.022448264476832576
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5894039735099338,
"acc_stderr": 0.04016689594849929,
"acc_norm": 0.5894039735099338,
"acc_norm_stderr": 0.04016689594849929
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9339449541284404,
"acc_stderr": 0.01064913148785894,
"acc_norm": 0.9339449541284404,
"acc_norm_stderr": 0.01064913148785894
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03099866630456053,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03099866630456053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.0315452167200547,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.0315452167200547
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517962,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517962
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783656,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783656
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.044939490686135404,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.044939490686135404
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9208173690932312,
"acc_stderr": 0.009656024044324226,
"acc_norm": 0.9208173690932312,
"acc_norm_stderr": 0.009656024044324226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8439306358381503,
"acc_stderr": 0.019539014685374036,
"acc_norm": 0.8439306358381503,
"acc_norm_stderr": 0.019539014685374036
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8011173184357542,
"acc_stderr": 0.013349892983092517,
"acc_norm": 0.8011173184357542,
"acc_norm_stderr": 0.013349892983092517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.01989943546353996,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.01989943546353996
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8585209003215434,
"acc_stderr": 0.019794326658090555,
"acc_norm": 0.8585209003215434,
"acc_norm_stderr": 0.019794326658090555
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.018498600558790906,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.018498600558790906
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.028338017428611334,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.028338017428611334
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6258148631029987,
"acc_stderr": 0.012359335618172063,
"acc_norm": 0.6258148631029987,
"acc_norm_stderr": 0.012359335618172063
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8455882352941176,
"acc_stderr": 0.021950024722922026,
"acc_norm": 0.8455882352941176,
"acc_norm_stderr": 0.021950024722922026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.015076937921915372,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.015076937921915372
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759033,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759033
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276894,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276894
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6487148102815178,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.7450059289802828,
"mc2_stderr": 0.014438907380750043
},
"harness|winogrande|5": {
"acc": 0.8634569850039463,
"acc_stderr": 0.009650242900291603
},
"harness|gsm8k|5": {
"acc": 0.7846853677028052,
"acc_stderr": 0.011322096294579658
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kaleemWaheed/twitter_dataset_1713039093 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21469
num_examples: 49
download_size: 12837
dataset_size: 21469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nccratliri/vad-animals | ---
license: apache-2.0
---
# Positive Transfer Of The Whisper Speech Transformer To Human And Animal Voice Activity Detection
We proposed WhisperSeg, utilizing the Whisper Transformer pre-trained for Automatic Speech Recognition (ASR) for both human and animal Voice Activity Detection (VAD). For more details, please refer to our paper
>
> [**Positive Transfer of the Whisper Speech Transformer to Human and Animal Voice Activity Detection**](https://doi.org/10.1101/2023.09.30.560270)
>
> Nianlong Gu, Kanghwi Lee, Maris Basha, Sumit Kumar Ram, Guanghao You, Richard H. R. Hahnloser <br>
> University of Zurich and ETH Zurich
This animals dataset was customized for animal Voice Activity Detection (vocal segmentation) and used when training the WhisperSeg segmenter.
## Download Dataset
```python
from huggingface_hub import snapshot_download
snapshot_download('nccratliri/vad-animals', local_dir = "data/vad-animals", repo_type="dataset" )
```
For more usage details, please refer to the GitHub repository: https://github.com/nianlonggu/WhisperSeg
## Citation
When using this dataset for your work, please cite:
```
@article {Gu2023.09.30.560270,
author = {Nianlong Gu and Kanghwi Lee and Maris Basha and Sumit Kumar Ram and Guanghao You and Richard Hahnloser},
title = {Positive Transfer of the Whisper Speech Transformer to Human and Animal Voice Activity Detection},
elocation-id = {2023.09.30.560270},
year = {2023},
doi = {10.1101/2023.09.30.560270},
publisher = {Cold Spring Harbor Laboratory},
abstract = {This paper introduces WhisperSeg, utilizing the Whisper Transformer pre-trained for Automatic Speech Recognition (ASR) for human and animal Voice Activity Detection (VAD). Contrary to traditional methods that detect human voice or animal vocalizations from a short audio frame and rely on careful threshold selection, WhisperSeg processes entire spectrograms of long audio and generates plain text representations of onset, offset, and type of voice activity. Processing a longer audio context with a larger network greatly improves detection accuracy from few labeled examples. We further demonstrate a positive transfer of detection performance to new animal species, making our approach viable in the data-scarce multi-species setting.Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2023/10/02/2023.09.30.560270},
eprint = {https://www.biorxiv.org/content/early/2023/10/02/2023.09.30.560270.full.pdf},
journal = {bioRxiv}
}
```
## Contact
nianlong.gu@uzh.ch |
nateraw/rendered-sst2 | ---
annotations_creators:
- machine-generated
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
pretty_name: Rendered SST-2
size_categories:
- 1K<n<10K
source_datasets:
- extended|sst2
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
# Rendered SST-2
The [Rendered SST-2 Dataset](https://github.com/openai/CLIP/blob/main/data/rendered-sst2.md) from Open AI.
Rendered SST2 is an image classification dataset used to evaluate a model's capability at optical character recognition. This dataset was generated by rendering sentences from the Stanford Sentiment Treebank v2 dataset.
This dataset contains two classes (positive and negative) and is divided into three splits: a train split containing 6920 images (3610 positive and 3310 negative), a validation split containing 872 images (444 positive and 428 negative), and a test split containing 1821 images (909 positive and 912 negative). |
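As a quick sanity check, the per-class counts above can be tallied to recover the stated split sizes. The following sketch does just that; the dictionary and helper names are illustrative and not part of any dataset API:

```python
# Per-split class counts as stated on this card (illustrative names,
# not part of the dataset's API).
SPLIT_CLASS_COUNTS = {
    "train": {"positive": 3610, "negative": 3310},
    "validation": {"positive": 444, "negative": 428},
    "test": {"positive": 909, "negative": 912},
}

def split_size(split: str) -> int:
    """Total number of rendered images in a split."""
    return sum(SPLIT_CLASS_COUNTS[split].values())

print(split_size("train"))       # 6920
print(split_size("validation"))  # 872
print(split_size("test"))        # 1821
```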
kiringodhwani/msp9 | ---
dataset_info:
features:
- name: From
sequence: string
- name: Sent
sequence: string
- name: To
sequence: string
- name: Cc
sequence: string
- name: Subject
sequence: string
- name: Attachment
sequence: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 8204305
num_examples: 6859
download_size: 3436668
dataset_size: 8204305
---
# Dataset Card for "msp9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/popukar_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of popukar/ポプカル/泡普卡 (Arknights)
This is the dataset of popukar/ポプカル/泡普卡 (Arknights), containing 97 images and their tags.
The core tags of this character are `hair_ornament, eyepatch, hat, red_eyes, black_headwear, grey_hair, hairclip, x_hair_ornament, medical_eyepatch, short_hair, rabbit_hair_ornament, bow, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 97 | 117.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/popukar_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 97 | 105.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/popukar_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 216 | 197.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/popukar_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/popukar_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, portrait, simple_background, black_jacket, open_mouth, red_bow, white_background, blush |
| 1 | 6 |  |  |  |  |  | 1girl, black_jacket, hood, long_sleeves, looking_at_viewer, pleated_skirt, red_skirt, solo, white_thighhighs, animal_ear_legwear, closed_mouth, hair_over_one_eye, holding_weapon, animal_ears, bag, black_footwear, full_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | portrait | simple_background | black_jacket | open_mouth | red_bow | white_background | blush | hood | long_sleeves | pleated_skirt | red_skirt | white_thighhighs | animal_ear_legwear | closed_mouth | hair_over_one_eye | holding_weapon | animal_ears | bag | black_footwear | full_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:--------------------|:---------------|:-------------|:----------|:-------------------|:--------|:-------|:---------------|:----------------|:------------|:-------------------|:---------------------|:---------------|:--------------------|:-----------------|:--------------|:------|:-----------------|:------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
irds/beir_quora_dev | ---
pretty_name: '`beir/quora/dev`'
viewer: false
source_datasets: ['irds/beir_quora']
task_categories:
- text-retrieval
---
# Dataset Card for `beir/quora/dev`
The `beir/quora/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/quora/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=5,000
- `qrels`: (relevance assessments); count=7,626
- For `docs`, use [`irds/beir_quora`](https://huggingface.co/datasets/irds/beir_quora)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/beir_quora_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/beir_quora_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
desiai/samachaar | ---
license: odc-by
---
|
stepkurniawan/test | ---
dataset_info:
- config_name: default
features:
- name: pokemon
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 43
num_examples: 2
download_size: 1215
dataset_size: 43
- config_name: eval_dataframe_test
features:
- name: question
dtype: string
- name: ground_truths
dtype: string
- name: answer
dtype: string
- name: contexts
sequence: string
splits:
- name: train
num_bytes: 16954
num_examples: 5
download_size: 21960
dataset_size: 16954
- config_name: starters
features:
- name: pokemon
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 65
num_examples: 3
download_size: 0
dataset_size: 65
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: eval_dataframe_test
data_files:
- split: train
path: eval_dataframe_test/train-*
- config_name: starters
data_files:
- split: train
path: starters/train-*
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlanYky/hate-no-instruction-with-symbol | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 2690166
num_examples: 2000
download_size: 1565347
dataset_size: 2690166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sheganinans/TickData | ---
license: mit
---
|
autoevaluate/autoeval-staging-eval-project-ben-yu__ms2_combined-823f066f-12515671 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ben-yu/ms2_combined
eval_info:
task: summarization
model: Blaise-g/long_t5_global_large_pubmed_explanatory
metrics: []
dataset_name: ben-yu/ms2_combined
dataset_config: ben-yu--ms2_combined
dataset_split: train
col_mapping:
text: Abstract
target: Target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/long_t5_global_large_pubmed_explanatory
* Dataset: ben-yu/ms2_combined
* Config: ben-yu--ms2_combined
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ben-yu](https://huggingface.co/ben-yu) for evaluating this model. |
SauravMaheshkar/NDC-classes-25 | ---
license: unknown
task_categories:
- graph-ml
tags:
- chemistry
configs:
- config_name: transductive
data_files:
- split: train
path: "processed/transductive/train_df.csv"
- split: valid
path: "processed/transductive/val_df.csv"
- split: test
path: "processed/transductive/test_df.csv"
- config_name: inductive
data_files:
- split: train
path: "processed/inductive/train_df.csv"
- split: valid
path: "processed/inductive/val_df.csv"
- split: test
path: "processed/inductive/test_df.csv"
- config_name: raw
data_files: "raw/*.txt"
---
Source Paper: https://arxiv.org/abs/1802.06916
### Usage
```
from torch_geometric.datasets.cornell import CornellTemporalHyperGraphDataset
dataset = CornellTemporalHyperGraphDataset(root = "./", name="NDC-classes-25", split="train")
```
### Citation
```bibtex
@article{Benson-2018-simplicial,
author = {Benson, Austin R. and Abebe, Rediet and Schaub, Michael T. and Jadbabaie, Ali and Kleinberg, Jon},
title = {Simplicial closure and higher-order link prediction},
year = {2018},
doi = {10.1073/pnas.1800683115},
publisher = {National Academy of Sciences},
issn = {0027-8424},
journal = {Proceedings of the National Academy of Sciences}
}
```
|
A-Bar/ar-vi_least_cs_train | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 7968749
num_examples: 20000
download_size: 3431503
dataset_size: 7968749
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ovior/twitter_dataset_1713101597 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2487437
num_examples: 7405
download_size: 1421547
dataset_size: 2487437
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ramonpzg/latin_music | ---
license: apache-2.0
---
|
bartelds/gos-demo | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
splits:
- name: development
num_bytes: 6030729
num_examples: 59
- name: test
num_bytes: 8229224
num_examples: 71
- name: train
num_bytes: 29128904
num_examples: 300
download_size: 43004020
dataset_size: 43388857
license: cc-by-4.0
task_categories:
- automatic-speech-recognition
language:
- gos
---
# Gronings transcribed speech
Demonstration dataset with Gronings transcribed speech based on the dataset released by [San et al. (2021)](https://github.com/fauxneticien/qbe-std_feats_eval).
For more information see the corresponding [ASRU 2021 paper](https://ieeexplore.ieee.org/abstract/document/9688301). |
bdjordjevic/first-dataset | ---
license: mit
---
|
AdapterOcean/data-standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 45007565
num_examples: 4406
download_size: 12691026
dataset_size: 45007565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tomytjandra/h-and-m-fashion-caption | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 7843224039.084
num_examples: 20491
download_size: 6302088359
dataset_size: 7843224039.084
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "h-and-m-fashion-caption"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mvasiliniuc/iva-swift-codeint | ---
annotations_creators:
- crowdsourced
license: other
language_creators:
- crowdsourced
language:
- code
task_categories:
- text-generation
tags:
- code
- swift
- native iOS development
size_categories:
- 100K<n<1M
source_datasets: []
pretty_name: iva-swift-codeint-raw
task_ids:
- language-modeling
---
# IVA Swift GitHub Code Dataset
## Dataset Description
This is the raw IVA Swift dataset extracted from GitHub.
It contains uncurated Swift files gathered to train a code generation model.
The dataset consists of 753693 Swift code files from GitHub totaling ~700 MB of data.
The dataset was created from the public GitHub dataset on Google BigQuery.
### How to use it
To download the full dataset:
```python
from datasets import load_dataset
dataset = load_dataset('mvasiliniuc/iva-swift-codeint', split='train')
```
```python
from datasets import load_dataset
dataset = load_dataset('mvasiliniuc/iva-swift-codeint', split='train')
print(dataset[77723])
#OUTPUT:
{
"repo_name":"simpleandpretty/decider-ios",
"path":"MessagesExtension/MediaResources.swift",
"copies":"1",
"size":"1232",
"content":"import Foundation\nimport UIKit\n\nclass MediaResources {\n\n static func mediaURL(forGameOption option:FightMove) -> URL {\n let bundle = Bundle.main\n guard\n let mediaURL = bundle.url(forResource: option.rawValue, withExtension: \"mp4\")\n ...",
"license":"gpl-3.0"
}
```
## Data Structure
### Data Fields
|Field|Type|Description|
|---|---|---|
|repo_name|string|name of the GitHub repository|
|path|string|path of the file in GitHub repository|
|copies|string|number of occurrences in dataset|
|content|string|content of the source file|
|size|string|size of the source file in bytes|
|license|string|license of GitHub repository|
### Instance
```json
{
"repo_name":"simpleandpretty/decider-ios",
"path":"MessagesExtension/MediaResources.swift",
"copies":"1",
"size":"1232",
"content":"import Foundation\nimport UIKit\n\nclass MediaResources {\n\n static func mediaURL(forGameOption option:FightMove) -> URL {\n let bundle = Bundle.main\n guard\n let mediaURL = bundle.url(forResource: option.rawValue, withExtension: \"mp4\")\n ...",
"license":"gpl-3.0"
}
```
## Languages
The dataset contains only Swift files.
```json
{
"Swift": [".swift"]
}
```
## Licenses
Each entry in the dataset contains the associated license. The following is a list of licenses involved and their occurrences.
```json
{
"agpl-3.0": 2775,
"apache-2.0": 180178,
"artistic-2.0": 314,
"bsd-2-clause": 5342,
"bsd-3-clause": 11429,
"cc0-1.0": 2718,
"epl-1.0": 980,
"gpl-2.0": 15751,
"gpl-3.0": 33074,
"isc": 1647,
"lgpl-2.1": 1741,
"lgpl-3.0": 6150,
"mit": 476518,
"mpl-2.0": 11799,
"unlicense": 3277
}
```
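As a quick sanity check (a sketch added here, not part of the original card), the per-license occurrence counts above sum exactly to the 753693 files reported in the dataset statistics:

```python
# License occurrence counts as listed in the dataset card.
license_counts = {
    "agpl-3.0": 2775,
    "apache-2.0": 180178,
    "artistic-2.0": 314,
    "bsd-2-clause": 5342,
    "bsd-3-clause": 11429,
    "cc0-1.0": 2718,
    "epl-1.0": 980,
    "gpl-2.0": 15751,
    "gpl-3.0": 33074,
    "isc": 1647,
    "lgpl-2.1": 1741,
    "lgpl-3.0": 6150,
    "mit": 476518,
    "mpl-2.0": 11799,
    "unlicense": 3277,
}

# The counts sum to the total number of files in the dataset.
total = sum(license_counts.values())
print(total)  # 753693
```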
## Dataset Statistics
```json
{
"Total size": "~712 MB",
"Number of files": 753693,
"Number of files under 500 bytes": 129827,
  "Average file size in bytes": 4245
}
```
## Dataset Creation
The dataset was created using Google BigQuery for GitHub:
https://cloud.google.com/blog/topics/public-datasets/github-on-bigquery-analyze-all-the-open-source-code
The following steps were pursued for data gathering:
1. Creation of a dataset and a table in a Google BigQuery project.
2. Creation of a bucket in Google Cloud Storage.
3. Creation of a query in the Google BigQuery project.
4. Running the query, configured to output its results to the dataset and table created in step 1.
5. Exporting the resulting dataset into the bucket created in step 2, as JSON with gzip compression.
These steps produced the following results:
* 2.7 TB processed,
* 464,215 extracted rows/Swift files,
* 1.46 GB total logical bytes,
* an export of 7 json.gz files totaling ~700 MB.
The SQL Query used is:
```sql
SELECT
f.repo_name, f.path, c.copies, c.size, c.content, l.license
FROM
(select f.*, row_number() over (partition by id order by path desc) as seqnum from `bigquery-public-data.github_repos.files` AS f) f
JOIN
`bigquery-public-data.github_repos.contents` AS c
ON
f.id = c.id AND seqnum=1
JOIN
`bigquery-public-data.github_repos.licenses` AS l
ON
f.repo_name = l.repo_name
WHERE
NOT c.binary AND ((f.path LIKE '%.swift') AND (c.size BETWEEN 0 AND 1048575))
```
## Data Splits
The dataset only contains a train split.
The curated version of this dataset was split across multiple repositories:
* Clean Version: https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean
* Clean Version Train: https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean-train
* Clean Version Valid: https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean-valid
# Considerations for Using the Data
The dataset comprises source code from various repositories, potentially containing harmful or biased code,
along with sensitive information such as passwords or usernames.
# Additional Information
## Dataset Curators
[mircea.dev@icloud.com](mailto:mircea.dev@icloud.com)
## Licensing Information
* The license of this open-source dataset is: other.
* The dataset is gathered from open-source repositories on [GitHub using BigQuery](https://cloud.google.com/blog/topics/public-datasets/github-on-bigquery-analyze-all-the-open-source-code).
* Find the license of each entry in the dataset in the corresponding license column.
## Citation Information
```bibtex
@misc {mircea_vasiliniuc_2023,
author = { {Mircea Vasiliniuc} },
title = { iva-swift-codeint (Revision c09ebf8) },
year = 2023,
url = { https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint },
doi = { 10.57967/hf/0778 },
publisher = { Hugging Face }
}
``` |