| datasetId | card |
|---|---|
Hemanth-thunder/tawiki | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 810734099
num_examples: 160651
download_size: 265394551
dataset_size: 810734099
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
diffusers-parti-prompts/if-v-1.0 | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Category
dtype: string
- name: Challenge
dtype: string
- name: Note
dtype: string
- name: images
dtype: image
- name: model_name
dtype: string
- name: seed
dtype: int64
splits:
- name: train
num_bytes: 166170790.0
num_examples: 1632
download_size: 166034308
dataset_size: 166170790.0
---
# Images of Parti Prompts for "if-v-1.0"
Code that was used to generate the images:
```py
from diffusers import DiffusionPipeline
import torch

# Stage I: base 64x64 text-to-image model
pipe_low = DiffusionPipeline.from_pretrained("DeepFloyd/IF-I-XL-v1.0", safety_checker=None, watermarker=None, torch_dtype=torch.float16, variant="fp16")
pipe_low.enable_model_cpu_offload()

# Stage II: super-resolution model, reusing the stage-I text encoder
pipe_up = DiffusionPipeline.from_pretrained("DeepFloyd/IF-II-L-v1.0", safety_checker=None, watermarker=None, text_encoder=pipe_low.text_encoder, torch_dtype=torch.float16, variant="fp16")
pipe_up.enable_model_cpu_offload()

prompt = ""  # a Parti prompt
generator = torch.Generator("cuda").manual_seed(0)

prompt_embeds, negative_prompt_embeds = pipe_low.encode_prompt(prompt)
# Keep the stage-I output as tensors ("pt") so it can be fed directly to stage II
images = pipe_low(prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_prompt_embeds, num_inference_steps=100, generator=generator, output_type="pt").images
images = pipe_up(prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_prompt_embeds, image=images, num_inference_steps=100, generator=generator).images[0]
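# Hypothetical follow-up, not part of the original card: the upscaled image
# could be saved under a name derived from the seed recorded in this dataset's
# "seed" column.
seed = 0  # matches manual_seed(0) above
out_name = f"if-v-1.0_seed{seed}.png"
# images.save(out_name)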
``` |
Praveen777/llama_guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966694
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rajendrabaskota/progan-train-dataset | ---
dataset_info:
features:
- name: file_path
dtype: string
- name: label
dtype: int64
- name: img_embed
dtype: string
splits:
- name: train1
num_bytes: 1369595157
num_examples: 80000
- name: train2
num_bytes: 684791443
num_examples: 40000
- name: train3
num_bytes: 1369569743
num_examples: 80000
- name: train4
num_bytes: 1369568608
num_examples: 80000
- name: train5
num_bytes: 684783753
num_examples: 40000
download_size: 3914461145
dataset_size: 5478308704
configs:
- config_name: default
data_files:
- split: train1
path: data/train1-*
- split: train2
path: data/train2-*
- split: train3
path: data/train3-*
- split: train4
path: data/train4-*
- split: train5
path: data/train5-*
---
|
susnatak/Bengali-healthcare | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 84833527
num_examples: 47531
download_size: 30445724
dataset_size: 84833527
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vinisf/djguina | ---
license: openrail
---
|
theLeaf1/mangesh_99_Llama2_Format | ---
dataset_info:
features:
- name: Llama2_Format
dtype: string
splits:
- name: train
num_bytes: 27675
num_examples: 99
download_size: 11765
dataset_size: 27675
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
atulpandey/custum_conll2003 | ---
license: openrail
---
|
ramadhani/ragas-subject-test-01 | ---
license: apache-2.0
---
|
Ryan1122/multiturn_cn_18k | ---
task_categories:
- conversational
language:
- zh
tags:
- multiturn
- self-instruct
- CN
size_categories:
- 10K<n<100K
license: cc-by-nc-4.0
---
# Dataset Card for Dataset Name
Will update soon! |
fathyshalab/massive_qa-de-DE | ---
dataset_info:
features:
- name: id
dtype: string
- name: locale
dtype: string
- name: partition
dtype: string
- name: scenario
dtype:
class_label:
names:
'0': social
'1': transport
'2': calendar
'3': play
'4': news
'5': datetime
'6': recommendation
'7': email
'8': iot
'9': general
'10': audio
'11': lists
'12': qa
'13': cooking
'14': takeaway
'15': music
'16': alarm
'17': weather
- name: intent
dtype:
class_label:
names:
'0': datetime_query
'1': iot_hue_lightchange
'2': transport_ticket
'3': takeaway_query
'4': qa_stock
'5': general_greet
'6': recommendation_events
'7': music_dislikeness
'8': iot_wemo_off
'9': cooking_recipe
'10': qa_currency
'11': transport_traffic
'12': general_quirky
'13': weather_query
'14': audio_volume_up
'15': email_addcontact
'16': takeaway_order
'17': email_querycontact
'18': iot_hue_lightup
'19': recommendation_locations
'20': play_audiobook
'21': lists_createoradd
'22': news_query
'23': alarm_query
'24': iot_wemo_on
'25': general_joke
'26': qa_definition
'27': social_query
'28': music_settings
'29': audio_volume_other
'30': calendar_remove
'31': iot_hue_lightdim
'32': calendar_query
'33': email_sendemail
'34': iot_cleaning
'35': audio_volume_down
'36': play_radio
'37': cooking_query
'38': datetime_convert
'39': qa_maths
'40': iot_hue_lightoff
'41': iot_hue_lighton
'42': transport_query
'43': music_likeness
'44': email_query
'45': play_music
'46': audio_volume_mute
'47': social_post
'48': alarm_set
'49': qa_factoid
'50': calendar_set
'51': play_game
'52': alarm_remove
'53': lists_remove
'54': transport_taxi
'55': recommendation_movies
'56': iot_coffee
'57': music_query
'58': play_podcasts
'59': lists_query
- name: text
dtype: string
- name: annot_utt
dtype: string
- name: worker_id
dtype: string
- name: slot_method
sequence:
- name: slot
dtype: string
- name: method
dtype: string
- name: judgments
sequence:
- name: worker_id
dtype: string
- name: intent_score
dtype: int8
- name: slots_score
dtype: int8
- name: grammar_score
dtype: int8
- name: spelling_score
dtype: int8
- name: language_identification
dtype: string
- name: label_name
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 329537
num_examples: 1183
- name: validation
num_bytes: 59481
num_examples: 214
- name: test
num_bytes: 79960
num_examples: 288
download_size: 141433
dataset_size: 468978
---
# Dataset Card for "massive_qa-de-DE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lili98/tarja1 | ---
license: openrail
---
|
Asap7772/elix_latent_preferences_gpt4 | ---
dataset_info:
features:
- name: yw
dtype: string
- name: yl
dtype: string
- name: x
dtype: string
- name: level
dtype: string
splits:
- name: train
num_bytes: 161699226.61892328
num_examples: 78412
- name: test
num_bytes: 19941227.381076723
num_examples: 9670
download_size: 4633749
dataset_size: 181640454.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1 | ---
pretty_name: Evaluation run of azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:57:08.636734](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1/blob/main/results_2024-03-10T00-57-08.636734.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.607447095635537,\n\
\ \"acc_stderr\": 0.03314052014839398,\n \"acc_norm\": 0.6119347527420224,\n\
\ \"acc_norm_stderr\": 0.033811338894945774,\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6822484423368418,\n\
\ \"mc2_stderr\": 0.015197767693951841\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522085,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6681935869348735,\n\
\ \"acc_stderr\": 0.004698995789478832,\n \"acc_norm\": 0.8484365664210317,\n\
\ \"acc_norm_stderr\": 0.003578643387547847\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943245,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943245\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764205,\n\
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172547,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172547\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119546,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119546\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6822484423368418,\n\
\ \"mc2_stderr\": 0.015197767693951841\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.01180736022402539\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40106141015921154,\n \
\ \"acc_stderr\": 0.013500158922245542\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|winogrande|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-57-08.636734.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- results_2024-03-10T00-57-08.636734.parquet
- split: latest
path:
- results_2024-03-10T00-57-08.636734.parquet
---
# Dataset Card for Evaluation run of azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1",
	"harness_winogrande_5",
	split="latest")
```
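Each configuration's split names are derived from the run timestamp, with the `-` and `:` separators replaced by underscores (compare the timestamp `2024-03-10T00:57:08.636734` with the split `2024_03_10T00_57_08.636734` in the config list above). A minimal sketch of that mapping; the helper name is hypothetical:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run's ISO timestamp into the split name used in this card,
    by replacing the '-' and ':' separators with underscores."""
    return timestamp.replace("-", "_").replace(":", "_")

# The run evaluated in this card:
split_name = run_timestamp_to_split("2024-03-10T00:57:08.636734")
print(split_name)  # 2024_03_10T00_57_08.636734
```

This is useful when you want to load a specific historical run rather than the `latest` split.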
## Latest results
These are the [latest results from run 2024-03-10T00:57:08.636734](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1/blob/main/results_2024-03-10T00-57-08.636734.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.607447095635537,
"acc_stderr": 0.03314052014839398,
"acc_norm": 0.6119347527420224,
"acc_norm_stderr": 0.033811338894945774,
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6822484423368418,
"mc2_stderr": 0.015197767693951841
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522085,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491888
},
"harness|hellaswag|10": {
"acc": 0.6681935869348735,
"acc_stderr": 0.004698995789478832,
"acc_norm": 0.8484365664210317,
"acc_norm_stderr": 0.003578643387547847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943245,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943245
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.025189149894764205,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.025189149894764205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172547,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172547
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119546,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119546
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6822484423368418,
"mc2_stderr": 0.015197767693951841
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.01180736022402539
},
"harness|gsm8k|5": {
"acc": 0.40106141015921154,
"acc_stderr": 0.013500158922245542
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mteb/legal_summarization | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- https://github.com/lauramanor/legal_summarization
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_examples: 439
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_examples: 438
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_examples: 284
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
---
**Legal_summarization**
- Original link: https://github.com/lauramanor/legal_summarization
- The dataset consists of 439 pairs of contracts and their summaries from [https://tldrlegal.com](https://tldrlegal.com/) and https://tosdr.org/.
- The query set consists of contract summaries. There are 284 queries.
- The corpus set comprises the contracts. There are 438 contracts in the corpus.
**Usage**
```python
import datasets
# Download the dataset
queries = datasets.load_dataset("mteb/legal_summarization", "queries")
documents = datasets.load_dataset("mteb/legal_summarization", "corpus")
pair_labels = datasets.load_dataset("mteb/legal_summarization", "default")
``` |
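The three configs are linked by ids: each row of the default (qrels) split pairs a `query-id` with a `corpus-id`. A minimal sketch of resolving qrels into (query text, document text, score) triples, using toy rows that mirror the schema above (the example content is hypothetical, not actual dataset rows):

```python
# Toy rows mirroring the queries / corpus / qrels schema described above.
# The ids and texts here are made up for illustration.
queries = [{"_id": "q1", "text": "Summary of contract A"}]
corpus = [{"_id": "d1", "title": "Contract A", "text": "Full text of contract A."}]
qrels = [{"query-id": "q1", "corpus-id": "d1", "score": 1.0}]

# datasets splits iterate as dicts, so the same join works on the real splits.
qtext = {row["_id"]: row["text"] for row in queries}
dtext = {row["_id"]: row["text"] for row in corpus}
triples = [
    (qtext[row["query-id"]], dtext[row["corpus-id"]], row["score"])
    for row in qrels
]
print(triples[0])  # ('Summary of contract A', 'Full text of contract A.', 1.0)
```

The same lookup-dict pattern applies unchanged to the loaded `queries`, `corpus`, and `default` splits from the snippet above.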
Aj901842/taciablg | ---
license: openrail
---
|
lumenwrites/gdquest | ---
dataset_info:
features:
- name: path
dtype: string
- name: sentence
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 64598455.82826131
num_examples: 3161
- name: test
num_bytes: 7279448.685738685
num_examples: 352
download_size: 66859575
dataset_size: 71877904.514
---
# Dataset Card for "gdquest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
152334H/tinystories | ---
license: mit
---
|
kristinashemet/Instruction_Input_dataset_08_04 | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Input
dtype: string
splits:
- name: train
num_bytes: 2952786
num_examples: 280
download_size: 152487
dataset_size: 2952786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
will33am/test_mechanic | ---
dataset_info:
features:
- name: image
dtype: image
- name: filepath
dtype: string
- name: race
dtype:
class_label:
names:
'0': asian
'1': black
'2': caucasian
'3': indian
- name: id
dtype: int64
- name: occupation
dtype:
class_label:
names:
'0': aerospace engineer
'1': automobile engineer
'2': civil engineer
'3': electrical engineer
'4': industrial engineer
'5': mechanic
'6': mechanical engineer
'7': petroleum engineer
- name: clip_tags_LAION_ViT_L_14_2B_ensemble_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_B_32_2B_simple_specific
dtype: string
splits:
- name: test
num_bytes: 462980204.0
num_examples: 4800
download_size: 462626268
dataset_size: 462980204.0
---
# Dataset Card for "test_mechanic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sagarshf/instruction-tuning-translate | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 2021145
num_examples: 3003
download_size: 680798
dataset_size: 2021145
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T08:32:40.202592](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16/blob/main/results_2023-10-25T08-32-40.202592.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12458053691275167,\n\
\ \"em_stderr\": 0.00338199412967585,\n \"f1\": 0.17434458892617408,\n\
\ \"f1_stderr\": 0.0034544534531551316,\n \"acc\": 0.4538521744906123,\n\
\ \"acc_stderr\": 0.010558058935343523\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.12458053691275167,\n \"em_stderr\": 0.00338199412967585,\n\
\ \"f1\": 0.17434458892617408,\n \"f1_stderr\": 0.0034544534531551316\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.133434420015163,\n \
\ \"acc_stderr\": 0.009366491609784486\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902559\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|arc:challenge|25_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T08_32_40.202592
path:
- '**/details_harness|drop|3_2023-10-25T08-32-40.202592.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T08-32-40.202592.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T08_32_40.202592
path:
- '**/details_harness|gsm8k|5_2023-10-25T08-32-40.202592.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T08-32-40.202592.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hellaswag|10_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T18-33-35.889629.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T18-33-35.889629.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T08_32_40.202592
path:
- '**/details_harness|winogrande|5_2023-10-25T08-32-40.202592.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T08-32-40.202592.parquet'
- config_name: results
data_files:
- split: 2023_09_11T18_33_35.889629
path:
- results_2023-09-11T18-33-35.889629.parquet
- split: 2023_10_25T08_32_40.202592
path:
- results_2023-10-25T08-32-40.202592.parquet
- split: latest
path:
- results_2023-10-25T08-32-40.202592.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T08:32:40.202592](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE1_17w-r16/blob/main/results_2023-10-25T08-32-40.202592.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.12458053691275167,
"em_stderr": 0.00338199412967585,
"f1": 0.17434458892617408,
"f1_stderr": 0.0034544534531551316,
"acc": 0.4538521744906123,
"acc_stderr": 0.010558058935343523
},
"harness|drop|3": {
"em": 0.12458053691275167,
"em_stderr": 0.00338199412967585,
"f1": 0.17434458892617408,
"f1_stderr": 0.0034544534531551316
},
"harness|gsm8k|5": {
"acc": 0.133434420015163,
"acc_stderr": 0.009366491609784486
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902559
}
}
```
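The top-level `"all"` block appears to be the unweighted mean of the per-task metrics. For example, the aggregated accuracy can be recomputed from the gsm8k and winogrande accuracies above (the drop task reports `em`/`f1` rather than `acc`, so it does not enter the accuracy average):

```python
# Recompute the aggregated accuracy from the per-task results above.
# Note: "all.acc" appears to average only the tasks that report an
# "acc" metric (gsm8k and winogrande); drop reports em/f1 instead.
results = {
    "harness|gsm8k|5": {"acc": 0.133434420015163},
    "harness|winogrande|5": {"acc": 0.7742699289660616},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # ~0.45385, matching the "all.acc" value above
```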
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
its5Q/habr_qna | ---
annotations_creators:
- crowdsourced
language:
- ru
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: Habr QnA
size_categories:
- 100K<n<1M
source_datasets:
- original
tags: []
task_categories:
- text-generation
- question-answering
task_ids:
- language-modeling
- open-domain-qa
---
# Dataset Card for Habr QnA
## Table of Contents
- [Dataset Card for Habr QnA](#dataset-card-for-habr-qna)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
## Dataset Description
- **Repository:** https://github.com/its5Q/habr-qna-parser
### Dataset Summary
This is a dataset of questions and answers scraped from [Habr QnA](https://qna.habr.com/). It contains 723,430 questions with their answers, comments, and other metadata.
### Languages
The dataset is mostly in Russian, with source code snippets in various programming languages.
## Dataset Structure
### Data Fields
Data fields can be previewed on the dataset card page.
### Data Splits
All 723,430 examples are in the train split; there is no validation split.
## Dataset Creation
The data was scraped with a script located in [my GitHub repository](https://github.com/its5Q/habr-qna-parser).
## Additional Information
### Dataset Curators
- https://github.com/its5Q |
pccl-org/formal-logic-simple-order-multi-token-fixed-objects-paired-relationship-0-10000 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 1419533112
num_examples: 4865250
download_size: 337732830
dataset_size: 1419533112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ricahrd/VDTC | ---
license: openrail
---
|
bz-arc13/wild_chat_en_zh_dedup_v2 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: Chinese
num_bytes: 345442221
num_examples: 87121
- name: English
num_bytes: 1256575247
num_examples: 233247
download_size: 793178102
dataset_size: 1602017468
configs:
- config_name: default
data_files:
- split: Chinese
path: data/Chinese-*
- split: English
path: data/English-*
---
|
Mireu-Lab/CIC-IDS | ---
task_categories:
- feature-extraction
tags:
- code
---
# CIC-IDS
This dataset is a collection of labeled network traffic flows covering multiple types of network attacks.
The contents of the dataset are as follows.
## Data
The types of attacks are as follows.
- DDoS
- Web_Attack_�_Brute_Force
- Infiltration
- DoS_GoldenEye
- DoS_Hulk
- Heartbleed
- Bot
- DoS_Slowhttptest
- Web_Attack_�_XSS
- DoS_slowloris
- FTP-Patator
- SSH-Patator
- Web_Attack_�_Sql_Injection
- PortScan
The percentage of attack attempts is as follows.

Detailed Attack Rate Chart
<img src="./image-20230926152655774.png" alt="image-20230926152655774" style="zoom:40%;" />

The dataset consists of the following files.
| File Name | Attack Type | Attack Share (%) |
| ----------------------------------------------------------- | ------------------------------------------------------------ | ------------- |
| Friday-WorkingHours-Afternoon-DDos.pcap_ISCX.csv | DDoS | 56 |
| Tuesday-WorkingHours.pcap_ISCX.csv | FTP-Patator, SSH-Patator | 3 |
| Friday-WorkingHours-Afternoon-PortScan.pcap_ISCX.csv | PortScan | 55 |
| Thursday-WorkingHours-Afternoon-Infilteration.pcap_ISCX.csv | Infiltration | 0.01 |
| Wednesday-workingHours.pcap_ISCX.csv | DoS_Hulk, DoS_Slowhttptest, DoS_GoldenEye, Heartbleed, DoS_slowloris | 36 |
| Friday-WorkingHours-Morning.pcap_ISCX.csv | Bot | 1.02 |
| Thursday-WorkingHours-Morning-WebAttacks.pcap_ISCX.csv | Web_Attack_�_XSS, Web_Attack_�_Brute_Force, Web_Attack_�_Sql_Injection | 1.27 |
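As a sketch of how the per-file attack shares above could be derived, one can count label occurrences in the CSVs with pandas. A toy in-memory frame stands in for a real file here; in the actual CSVs the label column is typically named `Label` (sometimes with a leading space), so the exact column name should be checked before use:

```python
import pandas as pd

# Toy stand-in for one of the CSV files listed above; in the real
# dataset the label column marks each flow as BENIGN or an attack type.
df = pd.DataFrame({
    "Label": ["BENIGN", "DDoS", "DDoS", "BENIGN", "DDoS", "PortScan"]
})

# Share of each label as a percentage of all flows in the file.
shares = df["Label"].value_counts(normalize=True) * 100
print(shares.round(2))  # DDoS flows make up 50% of this toy frame
```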
- License
The CICIDS2017 dataset consists of labeled network flows, including full packet payloads in pcap format, the corresponding profiles, and the labeled flows (GeneratedLabelledFlows.zip); CSV files for machine and deep learning purposes (MachineLearningCSV.zip) are publicly available for researchers. If you are using our dataset, you should cite our related paper, which outlines the details of the dataset and its underlying principles:
Iman Sharafaldin, Arash Habibi Lashkari, and Ali A. Ghorbani, “Toward Generating a New Intrusion Detection Dataset and Intrusion Traffic Characterization”, 4th International Conference on Information Systems Security and Privacy (ICISSP), Portugal, January 2018 |
minimario/apps_partial_sorted_300_end | ---
dataset_info:
features:
- name: problem
dtype: string
- name: code
dtype: string
- name: label
dtype: int64
- name: full_sample
dtype: string
- name: where_from
dtype: string
splits:
- name: train
num_bytes: 1043051462
num_examples: 780933
download_size: 34831859
dataset_size: 1043051462
---
# Dataset Card for "apps_partial_sorted_300_end"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2 | ---
pretty_name: Evaluation run of rombodawg/Open_Gpt4_8x7B_v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rombodawg/Open_Gpt4_8x7B_v0.2](https://huggingface.co/rombodawg/Open_Gpt4_8x7B_v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T18:56:10.033721](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2/blob/main/results_2024-01-13T18-56-10.033721.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7188157275221039,\n\
\ \"acc_stderr\": 0.030029707306740233,\n \"acc_norm\": 0.7225114431475408,\n\
\ \"acc_norm_stderr\": 0.03061684137993921,\n \"mc1\": 0.5605875152998776,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.7191590734021742,\n\
\ \"mc2_stderr\": 0.014814881257041205\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.01379618294778556,\n\
\ \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623496\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6761601274646485,\n\
\ \"acc_stderr\": 0.0046698341309770785,\n \"acc_norm\": 0.8615813582951604,\n\
\ \"acc_norm_stderr\": 0.0034463307489637123\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123377,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123377\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093288,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093288\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367405,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367405\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.723404255319149,\n \"acc_stderr\": 0.02924188386962882,\n\
\ \"acc_norm\": 0.723404255319149,\n \"acc_norm_stderr\": 0.02924188386962882\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5317460317460317,\n \"acc_stderr\": 0.0256993528321318,\n \"acc_norm\"\
: 0.5317460317460317,\n \"acc_norm_stderr\": 0.0256993528321318\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.864516129032258,\n \"acc_stderr\": 0.01946933458648693,\n \"acc_norm\"\
: 0.864516129032258,\n \"acc_norm_stderr\": 0.01946933458648693\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n\
\ \"acc_stderr\": 0.03395970381998574,\n \"acc_norm\": 0.6305418719211823,\n\
\ \"acc_norm_stderr\": 0.03395970381998574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682253,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682253\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n\
\ \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.025435119438105364,\n\
\ \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.025435119438105364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8990825688073395,\n \"acc_stderr\": 0.012914673545364432,\n \"\
acc_norm\": 0.8990825688073395,\n \"acc_norm_stderr\": 0.012914673545364432\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8774509803921569,\n \"acc_stderr\": 0.023015389732458265,\n \"\
acc_norm\": 0.8774509803921569,\n \"acc_norm_stderr\": 0.023015389732458265\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\
\ \"acc_stderr\": 0.029275891003969927,\n \"acc_norm\": 0.7443946188340808,\n\
\ \"acc_norm_stderr\": 0.029275891003969927\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n\
\ \"acc_stderr\": 0.031457038543062504,\n \"acc_norm\": 0.8796296296296297,\n\
\ \"acc_norm_stderr\": 0.031457038543062504\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911899,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911899\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.046161430750285455,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.046161430750285455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867457,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867457\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8812260536398467,\n\
\ \"acc_stderr\": 0.011569134791715655,\n \"acc_norm\": 0.8812260536398467,\n\
\ \"acc_norm_stderr\": 0.011569134791715655\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.021628077380196124,\n\
\ \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.021628077380196124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5687150837988827,\n\
\ \"acc_stderr\": 0.01656382939904771,\n \"acc_norm\": 0.5687150837988827,\n\
\ \"acc_norm_stderr\": 0.01656382939904771\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340866,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597252,\n\
\ \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597252\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5436766623207301,\n\
\ \"acc_stderr\": 0.012721420501462547,\n \"acc_norm\": 0.5436766623207301,\n\
\ \"acc_norm_stderr\": 0.012721420501462547\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7904411764705882,\n \"acc_stderr\": 0.02472311040767708,\n\
\ \"acc_norm\": 0.7904411764705882,\n \"acc_norm_stderr\": 0.02472311040767708\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7696078431372549,\n \"acc_stderr\": 0.017035229258034038,\n \
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.017035229258034038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.02116621630465939,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.02116621630465939\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015574,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015574\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.7191590734021742,\n\
\ \"mc2_stderr\": 0.014814881257041205\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.0104108497752228\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5913570887035633,\n \
\ \"acc_stderr\": 0.013540639733342429\n }\n}\n```"
repo_url: https://huggingface.co/rombodawg/Open_Gpt4_8x7B_v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-56-10.033721.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- '**/details_harness|winogrande|5_2024-01-13T18-56-10.033721.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T18-56-10.033721.parquet'
- config_name: results
data_files:
- split: 2024_01_13T18_56_10.033721
path:
- results_2024-01-13T18-56-10.033721.parquet
- split: latest
path:
- results_2024-01-13T18-56-10.033721.parquet
---
# Dataset Card for Evaluation run of rombodawg/Open_Gpt4_8x7B_v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rombodawg/Open_Gpt4_8x7B_v0.2](https://huggingface.co/rombodawg/Open_Gpt4_8x7B_v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2",
"harness_winogrande_5",
split="train")
```
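The aggregated metrics sit under top-level task keys in the results JSON (see "Latest results" below). As a minimal sketch — `extract_accuracies` is our helper name, not part of the leaderboard tooling — the per-task `acc` values can be pulled out of such a dict like this:

```python
def extract_accuracies(results: dict) -> dict:
    """Map each task name to its 'acc' value, skipping entries without one
    (e.g. truthfulqa:mc, which only reports mc1/mc2)."""
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if isinstance(metrics, dict) and "acc" in metrics
    }

# A tiny dict mirroring the shape of the results JSON in this card.
example = {
    "all": {"acc": 0.7188, "mc2": 0.7192},
    "harness|winogrande|5": {"acc": 0.8358, "acc_stderr": 0.0104},
    "harness|truthfulqa:mc|0": {"mc1": 0.5606, "mc2": 0.7192},
}
accs = extract_accuracies(example)
```

The same helper works unchanged on the full results dict loaded from the `results` configuration.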
## Latest results
These are the [latest results from run 2024-01-13T18:56:10.033721](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2/blob/main/results_2024-01-13T18-56-10.033721.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7188157275221039,
"acc_stderr": 0.030029707306740233,
"acc_norm": 0.7225114431475408,
"acc_norm_stderr": 0.03061684137993921,
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.7191590734021742,
"mc2_stderr": 0.014814881257041205
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.01379618294778556,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623496
},
"harness|hellaswag|10": {
"acc": 0.6761601274646485,
"acc_stderr": 0.0046698341309770785,
"acc_norm": 0.8615813582951604,
"acc_norm_stderr": 0.0034463307489637123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123377,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123377
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093288,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093288
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367405,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367405
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.723404255319149,
"acc_stderr": 0.02924188386962882,
"acc_norm": 0.723404255319149,
"acc_norm_stderr": 0.02924188386962882
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.0256993528321318,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.0256993528321318
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.01946933458648693,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.01946933458648693
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682253,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682253
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240524,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.025435119438105364,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.025435119438105364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8990825688073395,
"acc_stderr": 0.012914673545364432,
"acc_norm": 0.8990825688073395,
"acc_norm_stderr": 0.012914673545364432
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997865,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997865
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8774509803921569,
"acc_stderr": 0.023015389732458265,
"acc_norm": 0.8774509803921569,
"acc_norm_stderr": 0.023015389732458265
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969927,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969927
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.031457038543062504,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.031457038543062504
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911899,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911899
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.046161430750285455,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.046161430750285455
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867457,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867457
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8812260536398467,
"acc_stderr": 0.011569134791715655,
"acc_norm": 0.8812260536398467,
"acc_norm_stderr": 0.011569134791715655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.021628077380196124,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.021628077380196124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5687150837988827,
"acc_stderr": 0.01656382939904771,
"acc_norm": 0.5687150837988827,
"acc_norm_stderr": 0.01656382939904771
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340866,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597252,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597252
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5436766623207301,
"acc_stderr": 0.012721420501462547,
"acc_norm": 0.5436766623207301,
"acc_norm_stderr": 0.012721420501462547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7904411764705882,
"acc_stderr": 0.02472311040767708,
"acc_norm": 0.7904411764705882,
"acc_norm_stderr": 0.02472311040767708
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.017035229258034038,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.017035229258034038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.02116621630465939,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.02116621630465939
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015574,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015574
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.7191590734021742,
"mc2_stderr": 0.014814881257041205
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.0104108497752228
},
"harness|gsm8k|5": {
"acc": 0.5913570887035633,
"acc_stderr": 0.013540639733342429
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zhangshuoming/ExeBench-Eval-tiny-gpt3.5-result | ---
dataset_info:
features:
- name: c
dtype: string
- name: asm
dtype: string
splits:
- name: train
num_bytes: 48136
num_examples: 100
download_size: 23257
dataset_size: 48136
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ExeBench-Eval-tiny-gpt3.5-result"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GitBag/Reviewer2_PGE_raw | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# Raw Review Dataset for [Reviewer2](https://arxiv.org/abs/2402.10886)
This is the raw version of our dataset. The cleaned data files that can be used directly for fine-tuning are in [this](https://huggingface.co/datasets/GitBag/Reviewer2_PGE_cleaned) repository.
## Dataset Structure
The folders are structured in the following way:
```
venue
|--venue_year
|--venue_year_metadata
|--venue_year_id1_metadata.json
|--venue_year_id2_metadata.json
...
|--venue_year_paper
|--venue_year_id1_paper.json
|--venue_year_id2_paper.json
...
|--venue_year_review
|--venue_year_id1_review.json
|--venue_year_id2_review.json
...
|--venue_year_pdf
|--venue_year_id1_pdf.pdf
|--venue_year_id2_pdf.pdf
...
```
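Given the layout above, the file paths for a single paper id can be assembled without touching the filesystem. This is a hedged sketch — `record_paths` and the `"iclr"`/`2023`/`"id1"` values are placeholders for illustration, not names taken from the dataset:

```python
from pathlib import Path

def record_paths(root: Path, venue: str, year: int, paper_id: str) -> dict:
    """Build the expected file paths for one paper id, following the
    venue/venue_year/venue_year_<kind> layout described above."""
    base = f"{venue}_{year}"
    vdir = root / venue / base
    return {
        "metadata": vdir / f"{base}_metadata" / f"{base}_{paper_id}_metadata.json",
        "paper": vdir / f"{base}_paper" / f"{base}_{paper_id}_paper.json",
        "review": vdir / f"{base}_review" / f"{base}_{paper_id}_review.json",
        "pdf": vdir / f"{base}_pdf" / f"{base}_{paper_id}_pdf.pdf",
    }

paths = record_paths(Path("data"), "iclr", 2023, "id1")
```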
## Dataset Content
#### Paper Contents
- title: title of the paper
- authors: list of author names
- emails: list of author emails
- sections: list of sections of the paper
- heading: heading of the section
- text: text of the section
- references: list of references of the paper
- title: title of the reference
- author: list of author names of the reference
- venue: venue of the reference
- citeRegEx: citation expression
- shortCiteRegEx: short citation expression
- year: publication year of the reference
- referenceMentions: the location of the reference in the paper
- referenceID: numerical reference id
- context: context of the reference in the paper
- startOffset: start index of the context
- endOffset: end index of the context
- year: year of publication
- abstractText: abstract of the paper
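For illustration, the abstract and section list of one `*_paper.json` record can be flattened into plain text. `full_text` is a hypothetical helper, shown only to make the schema above concrete:

```python
def full_text(paper: dict) -> str:
    """Join the abstract and section texts of one *_paper.json record."""
    parts = [paper.get("abstractText", "")]
    for section in paper.get("sections", []):
        heading = section.get("heading") or ""
        parts.append(f"{heading}\n{section.get('text', '')}".strip())
    return "\n\n".join(p for p in parts if p)

doc = full_text({
    "abstractText": "We study X.",
    "sections": [{"heading": "1 Introduction", "text": "X matters."}],
})
```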
#### Metadata Contents
- id: unique id of the paper
- conference: venue for the paper
- decision: final decision for the paper (accept/reject)
- url: link to the PDF of the paper
- review_url: link to the review of the paper
- title: title of the paper
- authors: list of the authors of the paper
## Dataset Sources
We incorporate parts of the [PeerRead](https://github.com/allenai/PeerRead) and [NLPeer](https://github.com/UKPLab/nlpeer) datasets along with an up-to-date crawl from ICLR and NeurIPS on [OpenReview](https://openreview.net/) and [NeurIPS Proceedings](http://papers.neurips.cc/).
## Citation
If you find this dataset useful in your research, please cite the following paper:
```
@misc{gao2024reviewer2,
title={Reviewer2: Optimizing Review Generation Through Prompt Generation},
author={Zhaolin Gao and Kianté Brantley and Thorsten Joachims},
year={2024},
eprint={2402.10886},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
JeswinMS4/code_text_classifier | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': code
'1': text
splits:
- name: train
num_bytes: 58725
num_examples: 823
- name: validation
num_bytes: 3311
num_examples: 46
- name: test
num_bytes: 3320
num_examples: 46
download_size: 35195
dataset_size: 65356
---
# Dataset Card for "code_text_classifier"
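The `label` feature is a two-way class label (`0` = code, `1` = text, per the YAML above). A minimal sketch of the id/name mapping — the helper names here are ours, not part of the dataset:

```python
# Label ids follow the class_label names in the YAML metadata above.
ID2LABEL = {0: "code", 1: "text"}
LABEL2ID = {name: i for i, name in ID2LABEL.items()}

def decode(example: dict) -> dict:
    """Attach the string label name next to the integer label."""
    return {**example, "label_name": ID2LABEL[example["label"]]}

row = decode({"text": "def f(x): return x + 1", "label": 0})
```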
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datasciathlete/corpus4everyone-klue-small-korean-balance-NER | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
          '0': B-PS
          '1': I-PS
          '2': B-OG
          '3': I-OG
          '4': B-LC
          '5': I-LC
'6': O
splits:
- name: train
num_bytes: 62614599.59708029
num_examples: 48638
- name: validation
num_bytes: 17033235.99117251
num_examples: 12156
download_size: 8225312
dataset_size: 79647835.5882528
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
nielsgl/bayc | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1467029950.0
num_examples: 10000
download_size: 1463911871
dataset_size: 1467029950.0
---
# Dataset Card for "bayc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
damerajee/khasi-raw-data | ---
license: apache-2.0
---
|
Uilham/Charlinha | ---
license: openrail
---
|
jpacifico/French-Alpaca-dataset-Instruct-110K | ---
license: apache-2.0
language:
- fr
---
110,368 French instructions generated by OpenAI GPT-3.5 in the Alpaca format, for fine-tuning general-purpose models
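The card does not list the schema, but Alpaca-format records conventionally carry `instruction`, `input`, and `output` fields. Assuming those keys, one record can be rendered into the standard Alpaca prompt like this (a hedged sketch, not code from the dataset author):

```python
def to_alpaca_prompt(example: dict) -> str:
    """Render one record in the standard Alpaca prompt layout,
    dropping the Input block when the 'input' field is empty."""
    if example.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )

prompt = to_alpaca_prompt(
    {"instruction": "Traduis en anglais.", "input": "", "output": "Translate into English."}
)
```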
**Created by Jonathan Pacifico, 2024**
Please credit my name if you use this dataset in your project. |
reckitt-anugrahakbarp/SNS_caption_checker | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ekolasky/NQLongAnsForLSGSentClassWSTok | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: global_attention_mask
sequence: int64
- name: dataset_index
dtype: int64
splits:
- name: train
num_bytes: 5096195263
num_examples: 58195
download_size: 552583139
dataset_size: 5096195263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AppleHarem/nagato_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nagato (Azur Lane)
This is the dataset of nagato (Azur Lane), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) using [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 520 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 584 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 520 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 520 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 406 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 584 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 584 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
salehardec/narym-russian | ---
dataset_info:
features:
- name: nar
dtype: string
- name: ru
dtype: string
splits:
- name: train
num_bytes: 344623
num_examples: 5200
download_size: 191243
dataset_size: 344623
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dongyoung4091/hh-generated_flan_t5_rx_xl | ---
dataset_info:
features:
- name: response
dtype: string
- name: prompt
dtype: string
- name: reward_score
dtype: float64
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 4098047
num_examples: 25600
download_size: 935745
dataset_size: 4098047
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hh-generated_flan_t5_rx_xl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pytorch-survival/nwtco | ---
dataset_info:
features:
- name: stage
dtype: int64
- name: age
dtype: float32
- name: in.subcohort
dtype: float32
- name: instit_2
dtype: float32
- name: histol_2
dtype: float32
- name: study_4
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: int64
splits:
- name: train
num_bytes: 161120
num_examples: 4028
download_size: 41178
dataset_size: 161120
---
# Dataset Card for "nwtco"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilabel-internal-testing/deita-no-normalization | ---
size_categories: n<1K
config_names:
- deita_filtering
tags:
- synthetic
- distilabel
- rlaif
---
<p align="left">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
# Dataset Card for deita-no-normalization
This dataset has been created with [Distilabel](https://distilabel.argilla.io/).
## Dataset Summary
This dataset contains a `pipeline.yaml` which can be used to reproduce the pipeline that generated it, using the `distilabel` CLI:
```console
distilabel pipeline run --config "https://huggingface.co/datasets/distilabel-internal-testing/deita-no-normalization/raw/main/pipeline.yaml"
```
or explore the configuration:
```console
distilabel pipeline info --config "https://huggingface.co/datasets/distilabel-internal-testing/deita-no-normalization/raw/main/pipeline.yaml"
```
## Dataset structure
The examples have the following structure per configuration:
<details><summary> Configuration: deita_filtering </summary><hr>
```json
{
"answer": "The code snippet provided is correct and does not contain any logical errors.",
"completion": "The original code counts the number of lowercase vowels in a string, but the count should probably be case-insensitive. Here is a case-insensitive version.\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i].toLowerCase())) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"conversation": [
{
"content": "Can you find and correct any logical errors in the following code snippet and output the corrected code?\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i])) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"role": "user"
},
{
"content": "The code snippet provided is correct and does not contain any logical errors.",
"role": "assistant"
}
],
"deita_score": 1.0,
"embedding": [
-4.91015625,
4.33203125,
1.9873046875,
-6.66015625,
-3.3671875,
-5.59375,
0.1153564453125,
-2.244140625,
5.04296875,
0.1068115234375,
-2.48828125,
-1.6728515625,
-8.3671875,
-3.8359375,
8.703125,
-2.93359375,
0.29541015625,
-1.927734375,
2.591796875,
-1.3515625,
1.220703125,
0.446533203125,
1.6298828125,
2.84765625,
-2.935546875,
-9.15625,
-4.03125,
2.095703125,
-8.328125,
2.802734375,
-0.65771484375,
3.705078125,
-1.822265625,
4.2265625,
-4.92578125,
-2.234375,
4.421875,
-6.12109375,
5.140625,
-2.115234375,
6.1875,
3.685546875,
3.845703125,
-3.419921875,
4.19921875,
0.38427734375,
1.69140625,
2.755859375,
2.791015625,
0.498291015625,
5.38671875,
6.77734375,
4.390625,
12.4609375,
2.216796875,
-1.265625,
-3.640625,
0.360595703125,
6.72265625,
2.75,
1.3349609375,
-8.046875,
-2.1484375,
-3.537109375,
3.5859375,
-0.281982421875,
1.4970703125,
10.5,
-4.7265625,
-3.658203125,
4.05859375,
2.087890625,
-5.75390625,
-2.53515625,
0.270751953125,
2.9921875,
-2.365234375,
-1.8916015625,
-1.263671875,
-2.486328125,
5.22265625,
1.2529296875,
1.849609375,
-1.81640625,
3.103515625,
-5.08203125,
-5.421875,
0.315185546875,
-3.84375,
1.2685546875,
-4.328125,
-3.396484375,
-1.4970703125,
-4.1640625,
2.146484375,
-2.880859375,
-2.5390625,
5.25390625,
-3.904296875,
8.03125,
9.375,
4.41015625,
9.625,
-4.8046875,
3.087890625,
-8.0,
2.66796875,
4.5703125,
4.828125,
4.25,
-4.1171875,
-0.8466796875,
2.6015625,
-5.328125,
0.0166015625,
1.3369140625,
0.377685546875,
-5.76171875,
0.228759765625,
4.1875,
-4.04296875,
-2.716796875,
-1.8056640625,
-1.5341796875,
6.6796875,
8.6796875,
-11.8828125,
4.375,
1.984375,
-0.775390625,
3.873046875,
1.18359375,
-0.3740234375,
1.6708984375,
-7.36328125,
-3.94921875,
6.171875,
4.421875,
-1.1162109375,
-3.875,
3.1015625,
-1.517578125,
-2.365234375,
-3.849609375,
0.9599609375,
3.419921875,
-0.2459716796875,
-2.484375,
-3.943359375,
3.40234375,
-9.359375,
-8.171875,
-4.1171875,
0.9892578125,
-1.2119140625,
-7.9609375,
9.5859375,
-8.53125,
1.775390625,
-4.07421875,
-0.1368408203125,
-5.70703125,
-1.7041015625,
-2.626953125,
-0.58544921875,
0.927734375,
-5.8984375,
-7.7578125,
-0.19677734375,
-5.9609375,
-5.83203125,
4.2890625,
2.814453125,
-4.31640625,
7.92578125,
-1.68359375,
-6.69140625,
-6.69140625,
4.01171875,
2.57421875,
-2.8046875,
-2.734375,
-2.640625,
-2.505859375,
2.119140625,
3.619140625,
5.91015625,
0.267578125,
-2.45703125,
7.5234375,
2.66015625,
-4.44140625,
-6.3359375,
-0.55517578125,
-2.236328125,
3.7734375,
-1.759765625,
4.32421875,
-2.634765625,
5.49609375,
-1.4052734375,
3.6875,
2.94140625,
3.453125,
4.05078125,
-4.7265625,
0.92138671875,
6.99609375,
-0.433349609375,
-0.93310546875,
3.56640625,
1.0224609375,
6.89453125,
-0.09112548828125,
0.252197265625,
1.2529296875,
6.796875,
1.611328125,
-1.1044921875,
1.890625,
8.0546875,
3.009765625,
-1.1669921875,
3.875,
-6.26171875,
4.6953125,
-2.310546875,
-2.970703125,
-5.59765625,
-5.2734375,
0.88427734375,
-0.7265625,
0.90087890625,
-8.796875,
-3.369140625,
1.484375,
3.146484375,
3.091796875,
-5.2734375,
2.53125,
-8.9453125,
1.6279296875,
-3.275390625,
-4.30859375,
-4.53515625,
-3.130859375,
-0.3427734375,
-7.296875,
-3.298828125,
3.478515625,
-0.479736328125,
-3.484375,
6.359375,
1.14453125,
-3.51953125,
-1.6396484375,
4.4296875,
-8.9375,
10.296875,
0.7939453125,
-0.166748046875,
-2.2890625,
-2.583984375,
-4.8984375,
-3.09375,
-2.01953125,
-2.791015625,
12.359375,
-2.291015625,
-10.171875,
1.248046875,
2.11328125,
-3.455078125,
6.9765625,
-5.29296875,
1.6416015625,
-2.080078125,
7.9765625,
-2.841796875,
5.26953125,
-3.255859375,
3.345703125,
4.50390625,
-0.414306640625,
-0.6171875,
-6.12890625,
1.984375,
1.482421875,
2.451171875,
-4.1171875,
7.390625,
2.875,
-7.38671875,
-2.9609375,
-4.72265625,
1.3935546875,
6.8671875,
6.3828125,
1.078125,
0.45263671875,
7.86328125,
-1.7080078125,
-2.080078125,
-0.27978515625,
6.9296875,
6.76953125,
1.67578125,
-7.1171875,
8.7421875,
-9.4375,
11.3515625,
-6.02734375,
0.8193359375,
-1.1279296875,
1.90234375,
1.6328125,
2.4609375,
-0.151123046875,
-6.21875,
-5.6953125,
-4.71484375,
-1.6181640625,
-11.171875,
11.0625,
7.26953125,
0.78173828125,
-0.64892578125,
-1.328125,
3.669921875,
2.9296875,
3.10546875,
2.86328125,
-0.35595703125,
-4.4609375,
-2.01953125,
-1.05859375,
-0.3037109375,
2.8125,
-4.65625,
-3.24609375,
3.41796875,
-6.3203125,
-6.57421875,
-3.607421875,
3.158203125,
2.029296875,
3.158203125,
7.6171875,
-1.052734375,
-3.73046875,
4.89453125,
-4.03515625,
2.083984375,
-2.265625,
1.466796875,
-4.09765625,
1.806640625,
1.060546875,
-1.46484375,
3.154296875,
0.53125,
5.796875,
0.0093231201171875,
-1.11328125,
-12.2734375,
1.6513671875,
-0.76220703125,
1.6484375,
-1.0927734375,
-2.8984375,
-0.348388671875,
-0.45361328125,
5.046875,
-5.6796875,
2.794921875,
3.982421875,
1.9248046875,
4.21875,
-4.2109375,
5.46484375,
-1.19140625,
0.8037109375,
-2.994140625,
5.03125,
4.62109375,
2.375,
-1.828125,
-0.5498046875,
-3.390625,
1.294921875,
-3.140625,
-1.583984375,
2.533203125,
-3.509765625,
-1.3603515625,
0.85302734375,
-3.75,
-2.912109375,
4.32421875,
2.42578125,
-0.1812744140625,
3.83984375,
-5.4453125,
5.578125,
-6.62109375,
6.3671875,
-3.388671875,
3.421875,
-0.215087890625,
-0.067626953125,
-3.38671875,
0.077392578125,
-4.28515625,
0.9482421875,
-2.833984375,
1.5361328125,
6.1640625,
-0.88232421875,
-9.9609375,
-0.83642578125,
0.40380859375,
-6.625,
3.34765625,
-1.0732421875,
-2.841796875,
4.90625,
-0.394775390625,
-7.39453125,
8.15625,
1.7841796875,
2.048828125,
2.27734375,
6.921875,
1.482421875,
-6.31640625,
-3.654296875,
-1.7900390625,
-2.18359375,
3.0390625,
2.22265625,
-9.546875,
-3.10546875,
4.015625,
-10.0546875,
-0.876953125,
3.15625,
0.457763671875,
0.2059326171875,
-0.10797119140625,
-0.09967041015625,
-0.58544921875,
-4.015625,
4.80859375,
-1.0205078125,
-3.162109375,
11.2265625,
-1.291015625,
6.328125,
-6.3671875,
-7.328125,
5.08203125,
1.3564453125,
1.66015625,
0.7197265625,
19.40625,
-3.484375,
4.875,
-6.09765625,
9.4609375,
5.13671875,
4.15234375,
-2.96484375,
7.66015625,
-4.1953125,
-2.826171875,
-3.37890625,
-1.0703125,
-10.4140625,
-6.625,
5.890625,
-1.0556640625,
-0.703125,
-1.8564453125,
1.3505859375,
8.2890625,
0.06573486328125,
-4.97265625,
-0.443603515625,
-1.298828125,
-7.91015625,
-0.1473388671875,
-0.55419921875,
1.568359375,
-1.1650390625,
13.3828125,
-5.015625,
-2.990234375,
0.65380859375,
2.935546875,
0.86376953125,
-0.048858642578125,
-7.9296875,
5.8828125,
2.818359375,
6.16796875,
-2.84765625,
-1.490234375,
-3.6015625,
-0.1470947265625,
-9.046875,
2.662109375,
3.169921875,
-5.97265625,
10.5625,
8.8125,
-5.7734375,
-8.78125,
-5.109375,
-5.68359375,
3.7578125,
-8.6875,
-9.4609375,
3.078125,
2.587890625,
0.33740234375,
0.2122802734375,
1.275390625,
7.015625,
4.80078125,
1.7158203125,
-0.63232421875,
-9.6328125,
-0.84228515625,
-4.421875,
0.311279296875,
-7.3046875,
-2.529296875,
-6.625,
4.40234375,
-0.288818359375,
-1.4931640625,
-3.7578125,
-1.14453125,
-3.673828125,
3.673828125,
-6.37109375,
2.017578125,
1.6611328125,
5.35546875,
7.4453125,
5.40625,
16.9375,
-60.71875,
-1.5009765625,
2.28515625,
-1.009765625,
-1.4423828125,
6.7421875,
-4.44140625,
5.35546875,
1.09375,
2.02734375,
-3.888671875,
3.05078125,
2.404296875,
1.0048828125,
-0.791015625,
1.2666015625,
-1.9921875,
-2.060546875,
-2.12109375,
-5.02734375,
-0.048583984375,
8.6171875,
5.05078125,
-4.8671875,
-5.96875,
-6.3671875,
9.265625,
-6.41015625,
8.171875,
-2.1015625,
-3.279296875,
1.919921875,
1.326171875,
0.87548828125,
1.5986328125,
3.58203125,
-1.421875,
-2.576171875,
1.169921875,
-6.2421875,
-0.86279296875,
-2.814453125,
1.537109375,
8.421875,
4.36328125,
6.80859375,
-3.517578125,
-1.3662109375,
4.12890625,
-6.9765625,
1.3427734375,
8.546875,
3.654296875,
-5.21484375,
-5.78515625,
0.60107421875,
1.2236328125,
7.98828125,
5.61328125,
-2.41796875,
2.48828125,
-4.35546875,
-6.15234375,
0.34130859375,
-5.87109375,
-3.638671875,
4.3515625,
2.83203125,
-8.046875,
-11.3984375,
-6.421875,
-3.1640625,
6.7109375,
-4.296875,
1.4404296875,
-2.33984375,
0.6279296875,
-3.453125,
-0.85400390625,
4.6171875,
-9.3515625,
2.568359375,
-8.984375,
3.541015625,
1.845703125,
-6.7734375,
4.40234375,
3.666015625,
1.2744140625,
-4.765625,
-3.921875,
-1.220703125,
4.703125,
-6.66796875,
-1.5947265625,
-0.0748291015625,
3.990234375,
6.7109375,
0.379150390625,
1.3935546875,
0.61767578125,
6.3671875,
-0.313720703125,
-3.14453125,
7.046875,
-1.0263671875,
-3.85546875,
-0.2169189453125,
-2.572265625,
-2.283203125,
3.486328125,
7.80078125,
4.69921875,
-2.3984375,
7.328125,
-6.35546875,
9.21875,
-0.6591796875,
3.05859375,
-1.638671875,
-2.525390625,
3.029296875,
2.0859375,
-0.900390625,
-1.6494140625,
-0.01013946533203125,
1.3115234375,
7.8515625,
4.60546875,
-4.58203125,
2.1640625,
2.6796875,
-8.6640625,
6.01953125,
2.458984375,
-4.44921875,
4.5,
-0.87109375,
3.759765625,
-0.2401123046875,
1.3994140625,
2.751953125,
3.341796875,
10.9609375,
-2.88671875,
0.568359375,
4.4140625,
6.9453125,
9.3125,
0.81298828125,
-1.8046875,
-2.02734375,
-4.0078125,
21.84375,
-2.146484375,
0.11083984375,
0.7314453125,
-8.171875,
-1.7470703125,
1.7890625,
0.853515625,
1.3720703125,
0.93017578125,
-0.471435546875,
-7.20703125,
-2.611328125,
0.97607421875,
-4.765625,
5.21875,
-0.321533203125,
8.125,
-2.25,
7.640625,
1.890625,
2.419921875,
1.291015625,
0.1995849609375,
9.5859375,
5.390625,
-0.03826904296875,
-1.0478515625,
-3.708984375,
2.84375,
-6.42578125,
-4.78125,
-6.1328125,
-4.8046875,
6.04296875,
5.5703125,
10.5078125,
-0.01025390625,
-5.77734375,
-4.28125,
-2.44140625,
3.73046875,
5.3125,
0.86962890625,
-1.0849609375,
-3.234375,
3.2890625,
-1.10546875,
3.7734375,
1.375,
5.02734375,
-9.171875,
-1.4833984375,
-3.13671875,
2.890625,
-8.875,
-2.677734375,
-11.6171875,
0.13330078125,
-3.62890625,
1.33984375,
0.165283203125,
5.015625,
-1.5009765625,
-4.8671875,
6.86328125,
-0.92626953125,
-5.953125,
-0.8076171875,
0.1143798828125,
-3.25,
4.39453125,
-16.265625,
-9.46875,
2.779296875,
1.6767578125,
-3.80078125,
1.724609375,
-2.796875,
-4.37890625,
-1.5458984375,
3.376953125,
-3.51953125,
2.12109375,
0.2802734375,
0.93505859375,
-3.765625,
4.578125,
-2.3671875,
5.35546875,
-0.53369140625,
7.59375,
1.3408203125,
2.140625,
4.875,
9.4296875,
6.99609375,
-5.90625,
6.9453125,
2.453125,
3.958984375,
-1.89453125,
0.71484375,
-1.4072265625,
-2.50390625,
-1.1298828125,
-5.72265625,
-7.0078125,
-1.8369140625,
5.67578125,
-5.02734375,
0.5234375,
5.03515625,
0.564453125,
4.28125,
-2.470703125,
-9.3515625,
1.7236328125,
7.0859375,
2.41796875,
2.916015625,
5.46875,
7.328125,
-6.13671875,
10.484375,
1.8193359375,
2.544921875,
6.12109375,
5.703125,
2.80859375,
-2.45703125,
2.287109375,
-5.81640625,
8.6015625,
-3.275390625,
-3.033203125,
-2.01953125,
4.671875,
0.57470703125,
2.45703125,
-7.53515625,
7.41796875,
3.96484375,
-3.5,
-11.46875,
-1.6728515625,
-5.625,
-0.12841796875,
0.051025390625,
3.541015625,
7.2265625,
9.2421875,
1.4140625,
-1.087890625,
-4.76953125,
-3.390625,
-0.1375732421875,
-1.54296875,
-3.740234375,
-4.35546875,
11.2890625,
0.72509765625,
0.5810546875,
4.65625,
-9.453125,
0.31396484375,
-5.33984375,
1.3134765625,
-0.3291015625,
-1.001953125,
-5.625,
5.02734375,
1.615234375,
6.00390625,
-8.6640625,
-0.424560546875,
-5.296875,
0.9853515625,
6.59375,
-3.0078125,
6.59765625,
8.1484375,
1.953125,
6.29296875,
2.810546875,
3.857421875,
-4.7578125,
-8.734375,
-6.796875,
5.125,
1.185546875,
1.5107421875,
-2.95703125,
-0.379638671875,
-0.65869140625,
0.81494140625,
1.681640625,
9.0703125,
-4.56640625,
0.89697265625,
-5.66796875,
2.916015625,
7.34375,
-0.63818359375,
6.10546875,
2.763671875,
-7.015625,
2.18359375,
5.68359375,
-4.64453125,
0.73095703125,
-0.319580078125,
-0.65576171875,
-3.77734375,
-2.365234375,
7.57421875,
0.388671875,
-0.9521484375,
6.3046875,
-0.7607421875,
0.857421875,
-11.9765625,
11.6953125,
-1.6484375,
6.50390625,
-5.828125,
-2.005859375,
-0.6044921875,
3.21875,
6.18359375,
-2.05859375,
-1.6328125,
-0.431640625,
-1.2607421875,
-1.3505859375,
9.296875,
1.8251953125,
9.5859375,
2.892578125,
-1.8779296875,
-3.08203125,
-3.306640625,
-7.9375,
-10.6796875,
3.642578125,
1.7197265625,
-3.125,
-6.77734375,
6.9453125,
2.994140625,
6.6171875,
-1.2353515625,
-4.8046875,
5.92578125,
1.8291015625,
0.0386962890625,
6.76171875,
-1.86328125,
4.890625,
4.953125,
-2.361328125,
4.78125,
4.578125,
2.0078125,
-3.365234375,
3.01953125,
0.62255859375,
-2.33203125,
1.291015625,
-3.419921875,
-0.8486328125,
-1.208984375,
1.0673828125,
-5.109375,
6.9921875,
-4.9921875,
0.91650390625,
-1.708984375,
1.923828125,
0.462890625,
-24.421875,
1.7158203125,
21.1875,
-1.9208984375,
0.1943359375,
-3.388671875,
-0.160400390625,
3.966796875,
0.86083984375,
-0.4580078125,
1.056640625,
2.080078125,
9.4609375,
-4.9765625,
-3.00390625,
1.828125,
1.873046875,
3.849609375,
6.54296875,
-7.40234375,
-6.55859375,
9.2578125,
-3.7890625,
1.830078125,
3.607421875,
1.7705078125,
5.046875,
-8.3046875,
0.464599609375,
0.178466796875,
-0.91943359375,
-3.642578125,
0.08013916015625,
1.830078125,
2.158203125,
-0.55859375,
-0.11767578125,
-8.7890625,
5.55078125,
2.78515625,
2.56640625,
-8.1640625,
4.5546875,
1.525390625,
0.007720947265625,
-13.390625,
1.751953125,
1.21875,
-2.51953125,
0.393310546875,
-1.166015625,
5.984375,
2.837890625,
-3.62890625,
-0.72998046875,
-2.05078125,
4.7578125,
-8.0234375,
-4.0703125,
-0.97802734375,
4.44140625,
3.931640625,
-9.7890625,
-2.255859375,
-2.99609375,
3.857421875,
-2.880859375,
-3.734375,
-8.2578125,
-0.241455078125,
-4.8828125,
7.22265625,
-3.732421875,
-3.048828125,
2.7109375,
1.3173828125,
2.265625,
5.38671875,
-0.9111328125,
-2.078125,
0.6650390625,
3.837890625,
9.6953125,
-1.2529296875,
-5.921875,
-2.259765625,
-3.044921875,
2.630859375,
-0.33740234375,
9.703125,
6.015625,
-1.13671875,
-0.97412109375,
-1.9619140625,
5.421875,
1.6474609375,
-4.30859375,
2.0625,
1.513671875,
-2.490234375,
4.51171875,
9.7265625,
-3.939453125,
1.8662109375,
-0.369873046875,
1.478515625,
-8.7734375,
-9.140625,
0.060150146484375,
2.27734375,
1.1005859375,
-1.9873046875,
-1.0927734375,
-8.8203125,
-1.1015625,
5.046875,
4.859375,
-1.6875,
1.1318359375,
5.00390625,
-3.0625,
1.8447265625,
1.630859375,
-4.8515625,
-0.333984375,
10.2109375,
1.236328125,
3.138671875,
4.24609375,
3.123046875,
-3.03515625,
-12.2734375,
2.30078125,
-1.6083984375,
7.99609375,
1.359375,
-4.85546875,
-5.52734375,
7.52734375,
5.171875,
108.375,
5.24609375,
8.265625,
2.6953125,
-1.0380859375,
-3.12109375,
-4.8515625,
2.05078125,
0.849609375,
-4.3671875,
4.296875,
-1.4609375,
2.53125,
0.46826171875,
0.50537109375,
-1.970703125,
-2.80078125,
-1.3857421875,
11.140625,
1.041015625,
-0.06683349609375,
0.26708984375,
-2.41796875,
0.0491943359375,
-3.212890625,
-8.5703125,
5.0,
-1.015625,
5.19140625,
1.6708984375,
7.49609375,
4.5859375,
0.6787109375,
-1.3291015625,
16.609375,
6.02734375,
5.0703125,
-6.53515625,
5.58203125,
-2.76171875,
4.828125,
-0.48974609375,
13.9453125,
-3.8671875,
1.8486328125,
4.65625,
-2.99609375,
-1.0166015625,
-2.451171875,
4.0625,
3.072265625,
-3.9140625,
2.013671875,
8.953125,
-0.283447265625,
7.15234375,
-7.1484375,
4.796875,
-2.431640625,
-1.232421875,
8.578125,
-4.64453125,
4.59765625,
-1.01953125,
-7.58984375,
-1.7353515625,
-0.50390625,
6.9296875,
-2.66796875,
8.8203125,
4.90625,
11.9453125,
2.212890625,
-1.109375,
-5.53125,
-4.00390625,
2.845703125,
0.9423828125,
1.9404296875,
-2.966796875,
7.13671875,
-4.96875,
-3.470703125,
1.1083984375,
9.4140625,
-6.79296875,
-3.099609375,
3.76171875,
5.76171875,
-2.5546875,
5.703125,
-3.01953125,
6.2890625,
4.94140625,
7.64453125,
4.015625,
-3.013671875,
-3.78515625,
10.484375,
15.625,
-0.40087890625,
2.6171875,
2.65234375,
3.08984375,
-3.123046875,
-3.865234375,
-4.48046875,
-0.11663818359375,
-0.94091796875,
0.0467529296875,
-2.14453125,
-3.46484375,
8.2109375,
6.0078125,
-2.34375,
0.32666015625,
0.7373046875,
-2.998046875,
1.0751953125,
-0.70751953125,
4.484375,
-0.1673583984375,
-2.287109375,
-1.046875,
7.80859375,
2.630859375,
-1.2333984375,
-2.50390625,
0.94140625,
0.751953125,
-4.05859375,
1.060546875,
2.822265625,
1.29296875,
0.62158203125,
3.84765625,
-4.6796875,
-2.2578125,
2.23046875,
-4.41796875,
4.65625,
-3.201171875,
-3.017578125,
-8.0859375,
-2.521484375,
1.9755859375,
8.2265625,
4.421875,
2.599609375,
-5.78125,
-1.9228515625,
1.7822265625,
-3.60546875,
-5.26171875,
5.59375,
-4.02734375,
-5.6640625,
-6.890625,
3.48828125,
2.240234375,
-1.3408203125,
7.890625,
7.98828125,
-5.24609375,
5.078125,
-5.20703125,
3.578125,
-0.2509765625,
0.343994140625,
2.1328125,
-1.0068359375,
1.2451171875,
-6.20703125,
-4.50390625,
-0.51416015625,
-4.2734375,
-4.21484375,
-0.8310546875,
-0.57177734375,
4.98828125,
-0.50341796875,
-3.96875,
0.9775390625,
2.08203125,
-10.0,
0.169677734375,
1.8203125,
-8.2109375,
-3.619140625,
-6.359375,
1.466796875,
2.23046875,
-3.83984375,
-1.958984375,
9.921875,
2.271484375,
-1.6474609375,
5.21484375,
6.25,
-0.88330078125,
1.013671875,
-2.849609375,
-4.75390625,
5.984375,
0.1844482421875,
-1.3095703125,
-11.9921875,
4.00390625,
5.19140625,
3.314453125,
-2.470703125,
-9.25,
4.5859375,
-8.8046875,
-2.779296875,
-4.8125,
1.98828125,
6.828125,
2.720703125,
-7.20703125,
-9.3828125,
2.46875,
0.744140625,
3.99609375,
-1.9033203125,
4.828125,
-6.3984375,
12.7734375,
-2.931640625,
-8.109375,
3.01953125,
4.4375,
-0.450439453125,
-10.8359375,
-1.91796875,
2.001953125,
-1.345703125,
6.28125,
3.84765625,
-2.44140625,
7.31640625,
-3.419921875,
-2.67578125,
1.8134765625,
6.6328125,
8.0859375,
2.791015625,
34.125,
3.685546875,
2.59375,
-70.6875,
-6.28515625,
0.038909912109375,
1.5078125,
-1.380859375,
4.47265625,
-2.919921875,
-2.109375,
-5.66015625,
1.6103515625,
-5.44921875,
-7.6484375,
4.74609375,
8.234375,
0.974609375,
-1.1494140625,
1.3203125,
4.59765625,
-0.3525390625,
-9.03125,
2.62890625,
1.498046875,
5.59765625,
-5.49609375,
-1.2861328125,
6.1640625,
-1.8056640625,
7.48046875,
-2.900390625,
8.9609375,
2.29296875,
7.5078125,
0.181884765625,
-0.055023193359375,
-3.69921875,
-4.73046875,
1.3447265625,
15.4296875,
5.6171875,
-0.497802734375,
1.353515625,
1.1943359375,
-1.59765625,
-6.2265625,
6.5546875,
-0.285888671875,
1.6845703125,
-0.7509765625,
6.4609375,
6.08203125,
-4.99609375,
5.2734375,
-2.859375,
-11.609375,
-3.880859375,
8.375,
-6.0859375,
-5.421875,
-2.33203125,
22.890625,
1.271484375,
-0.84326171875,
-0.9501953125,
5.78515625,
0.10260009765625,
2.08984375,
-4.5078125,
3.177734375,
2.740234375,
-3.90234375,
-7.23828125,
-3.919921875,
6.5078125,
6.0234375,
5.96484375,
-2.1796875,
2.751953125,
2.29296875,
2.447265625,
-2.39453125,
6.70703125,
2.462890625,
8.6796875,
-3.6484375,
3.56640625,
-3.72265625,
-0.2371826171875,
5.234375,
2.673828125,
-5.9765625,
-1.517578125,
6.39453125,
6.72265625,
2.50390625,
3.46484375,
-4.23046875,
0.4033203125,
2.642578125,
1.7158203125,
-2.71484375,
-2.974609375,
-3.5078125,
-7.4140625,
4.45703125,
-1.5703125,
6.578125,
2.173828125,
0.62841796875,
4.64453125,
5.6171875,
6.0,
35.34375,
5.17578125,
-8.1015625,
2.33984375,
-8.890625,
-6.57421875,
-4.17578125,
-4.05859375,
1.4765625,
6.72265625,
2.11328125,
3.607421875,
-2.9921875,
-3.201171875,
14.7734375,
-2.615234375,
5.01953125,
3.1171875,
1.85546875,
4.859375,
1.0537109375,
3.048828125,
-2.869140625,
6.8125,
-12.0546875,
-0.3525390625,
-12.40625,
-11.4921875,
-1.8681640625,
-5.14453125,
-1.423828125,
-0.7392578125,
1.060546875,
-2.626953125,
-3.6484375,
6.296875,
5.37109375,
-0.317626953125,
4.91015625,
-2.31640625,
0.450927734375,
-1.0634765625,
-1.4287109375,
-1.0400390625,
-2.009765625,
1.3134765625,
-8.28125,
7.1796875,
2.21875,
-4.50390625,
-0.447265625,
-3.02734375,
3.857421875,
-2.595703125,
0.58935546875,
-2.994140625,
6.18359375,
3.083984375,
0.1630859375,
-7.83984375,
-0.478759765625,
2.3125,
0.1435546875,
0.3984375,
-5.46484375,
3.244140625,
1.4404296875,
-10.5078125,
9.96875,
2.96875,
-4.79296875,
3.931640625,
-4.8984375,
3.83984375,
-1.3515625,
0.370849609375,
1.3095703125,
-9.9140625,
-1.62109375,
0.8447265625,
5.453125,
-4.5234375,
-0.81640625,
4.0546875,
-2.919921875,
8.0625,
11.6171875,
-6.08984375,
-5.47265625,
-2.74609375,
-2.208984375,
-2.259765625,
16.859375,
-0.82666015625,
11.59375,
4.5625,
1.365234375,
-0.2314453125,
1.2939453125,
3.923828125,
6.49609375,
8.5390625,
0.267333984375,
-6.40234375,
2.05859375,
4.8984375,
2.359375,
0.69140625,
-5.00390625,
-1.421875,
2.146484375,
-2.794921875,
2.0546875,
-9.0390625,
-6.49609375,
3.1171875,
2.595703125,
-4.66796875,
-0.51611328125,
-2.78515625,
-1.7353515625,
0.2705078125,
5.95703125,
6.17578125,
-1.1826171875,
-4.1953125,
-1.7158203125,
-2.65625,
-4.34375,
1.2353515625,
1.490234375,
8.5,
5.87109375,
0.841796875,
-1.650390625,
0.1680908203125,
-2.2109375,
0.7431640625,
-7.578125,
-1.2080078125,
0.92626953125,
-8.09375,
-18.15625,
3.71875,
-1.892578125,
-0.364990234375,
-3.376953125,
2.955078125,
-1.2841796875,
-0.22314453125,
-1.037109375,
-1.4052734375,
7.74609375,
7.73046875,
3.587890625,
10.578125,
-0.48095703125,
2.447265625,
-5.59765625,
3.576171875,
-0.48828125,
0.417724609375,
7.10546875,
-1.6630859375,
3.263671875,
-0.06353759765625,
0.720703125,
-3.3203125,
0.0880126953125,
1.7626953125,
-0.016998291015625,
5.2578125,
4.12109375,
-1.2822265625,
-7.4140625,
-10.421875,
4.875,
0.59423828125,
-5.265625,
0.99853515625,
0.1746826171875,
-0.85107421875,
7.984375,
4.09765625,
-5.4921875,
4.1640625,
-1.40625,
4.11328125,
-7.71875,
2.16796875,
2.197265625,
-4.93359375,
-6.3046875,
-7.1953125,
1.373046875,
2.8125,
1.6015625,
-5.20703125,
-2.95703125,
1.974609375,
1.4833984375,
0.50830078125,
5.78125,
6.93359375,
7.80078125,
-3.63671875,
-0.1910400390625,
11.4921875,
-0.07989501953125,
4.390625,
4.15625,
4.72265625,
-1.7919921875,
6.921875,
-3.0,
-3.765625,
2.07421875,
3.3046875,
-6.38671875,
-3.953125,
-2.09375,
0.441162109375,
3.966796875,
0.50048828125,
-5.21484375,
-7.15625,
-5.52734375,
-7.29296875,
3.478515625,
-6.0546875,
-7.296875,
-1.6337890625,
-7.3046875,
-0.6982421875,
-1.3427734375,
2.51953125,
0.56005859375,
-1.9208984375,
-2.041015625,
3.7578125,
6.21875,
1.1005859375,
1.2666015625,
2.78125,
-15.4140625,
2.048828125,
4.0234375,
0.65185546875,
-1.71875,
1.58984375,
9.671875,
-2.529296875,
1.447265625,
11.0,
2.712890625,
1.9052734375,
3.28515625,
1.4697265625,
-0.4619140625,
1.8046875,
-6.93359375,
2.630859375,
3.513671875,
-3.21875,
4.1796875,
-3.673828125,
-2.095703125,
-2.134765625,
6.0546875,
8.125,
0.32080078125,
4.6953125,
-0.8740234375,
-4.7578125,
-3.994140625,
0.48779296875,
2.9453125,
-3.634765625,
-1.197265625,
-1.46484375,
3.37109375,
0.1942138671875,
-6.00390625,
4.20703125,
-0.30712890625,
1.7978515625,
1.3525390625,
-3.708984375,
-5.6171875,
-1.345703125,
2.376953125,
-0.322265625,
-2.4453125,
-3.90625,
-5.5859375,
-10.2421875,
-1.7880859375,
3.693359375,
1.9248046875,
7.38671875,
2.310546875,
1.515625,
4.07421875,
-0.1441650390625,
-3.021484375,
-1.9716796875,
-0.411376953125,
5.70703125,
-0.1103515625,
5.5234375,
5.66015625,
7.43359375,
4.6953125,
-4.7578125,
-0.857421875,
-4.98828125,
2.294921875,
-0.810546875,
2.181640625,
-2.625,
1.806640625,
-10.5,
-2.115234375,
4.44921875,
5.1484375,
-0.0916748046875,
-4.12109375,
-4.78515625,
-4.1328125,
7.46484375,
4.77734375,
1.595703125,
10.203125,
5.37890625,
-3.861328125,
-2.283203125,
-5.31640625,
9.2109375,
1.5302734375,
6.2109375,
-2.048828125,
-3.4453125,
2.146484375,
0.1695556640625,
1.0068359375,
-0.96435546875,
-2.763671875,
-7.9921875,
1.1494140625,
-0.2386474609375,
2.6640625,
-0.58642578125,
-2.8125,
2.3828125,
11.7265625,
7.1875,
1.400390625,
7.65234375,
-0.035797119140625,
4.01953125,
-4.61328125,
-0.69677734375,
-4.19921875,
-1.09375,
-4.53125,
2.59765625,
-2.611328125,
2.029296875,
-0.1539306640625,
14.8359375,
-1.455078125,
-5.62109375,
-6.05859375,
-6.20703125,
-0.24462890625,
-1.5869140625,
-0.18994140625,
5.30078125,
2.19921875,
1.4677734375,
2.890625,
2.171875,
-6.5703125,
0.291259765625,
-0.5439453125,
-0.216796875,
-1.705078125,
2.861328125,
6.9921875,
-7.26171875,
1.4443359375,
-6.0703125,
-0.50732421875,
-5.19140625,
4.58984375,
3.845703125,
-3.82421875,
3.505859375,
-0.9912109375,
-6.4375,
11.0078125,
2.95703125,
4.61328125,
5.5703125,
6.79296875,
-2.423828125,
-1.8369140625,
5.875,
0.53125,
-5.4375,
-2.666015625,
-0.15087890625,
5.68359375,
6.6953125,
-5.79296875,
6.72265625,
-4.6796875,
5.921875,
4.1171875,
2.2578125,
2.490234375,
-2.41796875,
-2.20703125,
3.916015625,
1.806640625,
-6.47265625,
-0.97314453125,
-4.37109375,
3.03515625,
6.5234375,
1.228515625,
6.20703125,
4.03515625,
4.05078125,
3.640625,
2.255859375,
-3.279296875,
0.07916259765625,
-10.9453125,
8.125,
5.7109375,
3.98828125,
4.62109375,
12.3046875,
-2.904296875,
5.73046875,
-4.00390625,
5.15234375,
-2.48828125,
-1.173828125,
2.91015625,
2.904296875,
2.728515625,
8.3359375,
4.21875,
7.59375,
2.302734375,
7.2890625,
0.409423828125,
-13.1015625,
2.861328125,
-1.6640625,
-2.91015625,
-6.04296875,
7.65234375,
10.125,
7.24609375,
0.68017578125,
1.7646484375,
8.9296875,
15.578125,
-2.923828125,
0.7431640625,
-7.0078125,
1.947265625,
-3.201171875,
4.828125,
-0.2156982421875,
2.212890625,
0.5498046875,
11.0703125,
-3.015625,
1.1982421875,
-4.828125,
-12.4609375,
1.8349609375,
3.21875,
2.8515625,
1.9228515625,
-1.505859375,
5.0625,
-3.44921875,
1.775390625,
-1.390625,
-7.7265625,
-16.96875,
-3.83984375,
-4.8359375,
-4.0625,
5.28125,
5.5390625,
-1.900390625,
2.12109375,
-7.0859375,
-5.89453125,
-0.3896484375,
0.1146240234375,
-1.87109375,
9.3984375,
0.7763671875,
2.7578125,
0.71240234375,
-5.4921875,
7.18359375,
6.4140625,
4.859375,
0.62451171875,
-4.5234375,
-0.032318115234375,
0.96533203125,
-2.677734375,
0.1219482421875,
0.328125,
2.57421875,
1.8310546875,
-2.8671875,
-1.2763671875,
-7.38671875,
-0.58935546875,
-1.4990234375,
9.4609375,
1.9384765625,
0.176025390625,
5.953125,
1.697265625,
0.020843505859375,
-3.3515625,
-67.8125,
2.44921875,
7.0390625,
-3.287109375,
3.67578125,
2.533203125,
0.9873046875,
-1.544921875,
0.0667724609375,
0.419677734375,
-0.1705322265625,
-2.1953125,
1.30078125,
2.1328125,
10.921875,
-3.15625,
4.88671875,
7.8203125,
2.837890625,
-7.59765625,
-0.74462890625,
-9.3828125,
7.5078125,
-3.185546875,
-10.6796875,
9.28125,
-6.98828125,
-0.810546875,
8.2578125,
5.7421875,
3.82421875,
4.42578125,
6.0703125,
0.37744140625,
-3.76171875,
-0.71337890625,
7.70703125,
-1.90234375,
0.77099609375,
-7.17578125,
-3.00390625,
-1.3310546875,
4.73828125,
4.09765625,
-1.0048828125,
1.3447265625,
-2.896484375,
1.5595703125,
-10.8984375,
4.08984375,
6.7265625,
0.8955078125,
0.73095703125,
7.15234375,
0.72900390625,
10.546875,
2.228515625,
-6.95703125,
-1.5947265625,
-6.2734375,
-1.662109375,
-0.2110595703125,
-3.06640625,
3.310546875,
-2.98828125,
-1.466796875,
-9.3203125,
2.6796875,
1.2451171875,
1.8720703125,
-9.6328125,
-2.708984375,
3.00390625,
-4.1953125,
-1.2705078125,
6.80078125,
1.646484375,
-0.3798828125,
-5.1484375,
0.491455078125,
8.953125,
8.4765625,
-3.830078125,
-0.0059661865234375,
8.375,
1.44921875,
-4.69921875,
0.78955078125,
-2.927734375,
-0.27001953125,
5.46875,
-7.6796875,
-3.953125,
1.0234375,
0.325439453125,
0.71533203125,
-1.166015625,
-0.71435546875,
-0.50927734375,
0.65966796875,
2.314453125,
-7.0390625,
-5.37109375,
0.40380859375,
2.18359375,
5.0546875,
-2.796875,
4.9453125,
-0.59619140625,
-0.88037109375,
-8.4140625,
-2.140625,
-3.49609375,
-13.2109375,
-1.802734375,
1.9921875,
-0.9072265625,
-5.0,
4.4765625,
2.216796875,
2.34375,
5.6484375,
-9.78125,
1.138671875,
1.71484375,
-2.58203125,
4.65625,
-7.0390625,
-3.33984375,
1.5458984375,
-1.3212890625,
-3.947265625,
-6.60546875,
3.35546875,
4.28515625,
-1.298828125,
-2.83984375,
-0.2037353515625,
-3.109375,
-2.966796875,
3.7109375,
3.509765625,
4.68359375,
7.91015625,
-1.5302734375,
-3.0546875,
-1.986328125,
-3.484375,
-3.7421875,
-4.3984375,
-5.828125,
-1.8173828125,
-7.80078125,
-2.578125,
-1.81640625,
15.609375,
7.3125,
4.484375,
0.1092529296875,
10.234375,
5.58984375,
-6.7265625,
0.361083984375,
-1.955078125,
4.21875,
2.212890625,
-9.5703125,
4.62109375,
0.91748046875,
4.01171875,
0.58544921875,
3.966796875,
3.49609375,
-0.9775390625,
-2.12890625,
6.73046875,
-10.21875,
-4.74609375,
1.4345703125,
-12.109375,
-11.1953125,
12.8515625,
3.87109375,
-4.125,
-5.359375,
-1.796875,
-9.1015625,
1.1767578125,
0.448486328125,
0.2115478515625,
3.21484375,
2.69921875,
-1.06640625,
6.18359375,
9.2265625,
-0.1815185546875,
1.3154296875,
-5.125,
-1.0380859375,
2.4765625,
7.8671875,
2.107421875,
4.6796875,
4.21875,
-5.8515625,
-6.56640625,
3.451171875,
-3.556640625,
-2.138671875,
226.125,
-3.013671875,
5.73046875,
1.2275390625,
1.8876953125,
5.93359375,
1.751953125,
-6.07421875,
3.041015625,
6.3359375,
13.4375,
-7.73046875,
-1.8583984375,
-3.259765625,
-6.91796875,
3.732421875,
-1.40234375,
-2.056640625,
-5.02734375,
3.50390625,
7.08984375,
11.390625,
1.775390625,
0.74365234375,
-0.93603515625,
4.125,
-1.0478515625,
-0.494140625,
-5.77734375,
5.85546875,
8.71875,
1.138671875,
5.23046875,
-1.201171875,
-7.76953125,
2.771484375,
-3.78125,
-1.5703125,
7.00390625,
-0.038421630859375,
6.82421875,
3.11328125,
3.01171875,
-0.63037109375,
-5.9921875,
3.12890625,
4.8984375,
1.1552734375,
-2.572265625,
5.30078125,
2.330078125,
2.0078125,
0.6669921875,
-2.3671875,
-3.677734375,
2.900390625,
-1.9794921875,
3.26171875,
-1.287109375,
-3.345703125,
-6.59375,
-1.4462890625,
4.3515625,
0.90087890625,
-2.8828125,
3.447265625,
0.78857421875,
3.478515625,
2.24609375,
0.1895751953125,
-4.921875,
0.312744140625,
7.390625,
10.078125,
2.6171875,
-2.990234375,
-5.15625,
-3.7265625,
7.01953125,
2.302734375,
-5.953125,
-6.265625,
-1.3916015625,
4.12890625,
8.578125,
5.8125,
-3.22265625,
-3.291015625,
1.0322265625,
0.349365234375,
2.658203125,
2.41015625,
-2.7109375,
-2.091796875,
8.8046875,
1.1328125,
2.255859375,
5.1953125,
0.80029296875,
-1.287109375,
-1.240234375,
4.1953125,
-4.07421875,
-2.330078125,
0.31689453125,
4.14453125,
-5.36328125,
-0.37109375,
-5.34375,
1.2939453125,
-0.059783935546875,
1.4267578125,
-5.7578125,
-0.7763671875,
-2.86328125,
-14.3125,
-5.875,
-0.82568359375,
-3.888671875,
0.40576171875,
-2.927734375,
1.8603515625,
-3.484375,
-2.068359375,
1.478515625,
-0.36572265625,
-3.962890625,
8.5234375,
6.23046875,
-10.6796875,
-0.6904296875,
-8.3671875,
1.767578125,
-2.556640625,
-4.91015625,
-3.73046875,
-8.359375,
1.71875,
-5.83984375,
-2.115234375,
-4.51171875,
5.55859375,
5.328125,
4.3671875,
-1.8623046875,
0.15869140625,
-9.4609375,
0.66259765625,
0.37646484375,
4.52734375,
1.8994140625,
1.70703125,
-0.71728515625,
-4.4609375,
-4.52734375,
-1.3056640625,
-1.8828125,
7.65234375,
-4.875,
4.890625,
-1.0595703125,
1.982421875,
4.40625,
-0.485107421875,
-5.66015625,
-1.1123046875,
1.3154296875,
3.650390625,
-0.98193359375,
1.892578125,
-3.35546875,
-30.609375,
0.06640625,
0.019195556640625,
-7.73046875,
1.0390625,
6.10546875,
4.921875,
5.64453125,
-5.06640625,
-4.39453125,
-12.625,
-4.1484375,
-0.68505859375,
4.15234375,
-8.546875,
-4.1484375,
-1.0830078125,
-2.27734375,
-0.7490234375,
1.94140625,
2.814453125,
-2.537109375,
0.3095703125,
-8.015625,
5.54296875,
5.25,
5.0,
3.138671875,
2.876953125,
-2.197265625,
-1.0966796875,
2.861328125,
-0.342041015625,
-7.90625,
2.619140625,
-8.3359375,
5.265625,
-0.9755859375,
-0.177978515625,
6.77734375,
0.96923828125,
-5.21875,
-1.7373046875,
6.4375,
0.66015625,
0.037322998046875,
1.6005859375,
4.98046875,
-1.8466796875,
-1.2412109375,
2.345703125,
13.4921875,
-1.4951171875,
-3.60546875,
-2.15625,
3.623046875,
1.51171875,
3.3359375,
-1.1162109375,
-0.751953125,
0.5966796875,
-3.50390625,
0.1312255859375,
-0.432373046875,
-3.90625,
-3.318359375,
0.4306640625,
4.5546875,
0.317138671875,
-0.5947265625,
2.529296875,
-0.346923828125,
-1.458984375,
6.06640625,
0.8486328125,
-6.734375,
4.41796875,
-1.44140625,
-5.11328125,
2.33984375,
-2.037109375,
3.19140625,
2.03125,
-6.0546875,
-3.6875,
4.140625,
-6.4453125,
1.2763671875,
6.12890625,
7.48046875,
6.08203125,
2.146484375,
-0.280517578125,
-0.4189453125,
-0.9560546875,
-3.169921875,
-4.55078125,
10.8125,
1.169921875,
-0.9267578125,
-6.37109375,
-5.64453125,
-3.1171875,
-2.8359375,
-4.9609375,
3.6484375,
-5.28125,
1.7744140625,
1.6474609375,
3.62109375,
1.533203125,
4.16015625,
1.0927734375,
0.31103515625,
1.6396484375,
-6.46484375,
-8.6875,
1.37109375,
3.33203125,
-0.38232421875,
-0.36962890625,
-2.990234375,
5.4296875,
-0.432861328125,
2.365234375,
0.397705078125,
0.348388671875,
-3.927734375,
3.021484375,
0.8017578125,
2.375,
6.8125,
-6.6484375,
3.796875,
-0.6904296875,
-0.66650390625,
0.0271148681640625,
-6.8203125,
-3.861328125,
4.5,
5.8046875,
-0.6064453125,
0.359619140625,
8.0078125,
-6.109375,
0.18359375,
-3.43359375,
0.880859375,
-4.7578125,
1.8349609375,
8.4453125,
-9.6875,
3.345703125,
-8.9375,
-3.3984375,
-0.61865234375,
2.41796875,
-11.1484375,
-2.203125,
-2.25,
2.22265625,
-6.3046875,
-3.82421875,
3.060546875,
-8.8125,
-0.69384765625,
-1.5380859375,
-0.5498046875,
-8.6640625,
2.384765625,
-2.0234375,
8.7890625,
1.0625,
-3.966796875,
-2.556640625,
3.4453125,
1.4677734375,
-10.3046875,
-2.3046875,
2.4609375,
-0.318359375,
0.348388671875,
0.83251953125,
-1.55859375,
0.87255859375,
3.19921875,
10.53125,
4.515625,
3.9609375,
6.4140625,
-2.724609375,
3.64453125,
-8.90625,
8.703125,
-2.490234375,
-10.953125,
-1.4853515625,
5.5234375,
7.21484375,
-2.279296875,
0.89892578125,
-0.5703125,
-9.890625,
-3.48828125,
-8.1875,
8.5390625,
3.47265625,
4.72265625,
-0.6240234375,
2.7109375,
7.859375,
-5.64453125,
-4.30078125,
-2.1796875,
-6.00390625,
-5.54296875,
-2.650390625,
-9.046875,
0.053985595703125,
-3.634765625,
0.13671875,
-7.125,
-1.6611328125,
-1.9228515625,
1.955078125,
-2.548828125,
-2.45703125,
-3.712890625,
2.484375,
4.51171875,
0.17578125,
5.859375,
7.24609375,
-3.888671875,
-1.013671875,
3.37890625,
5.2734375,
1.7392578125,
6.4921875,
-1.6962890625,
0.1575927734375,
6.2578125,
9.03125,
5.57421875,
-3.0234375,
0.50146484375,
-1.53515625,
7.859375,
-0.5283203125,
-2.314453125,
5.25,
-1.2978515625,
-2.744140625,
-1.125,
1.5390625,
-6.80859375,
6.3046875,
-7.609375,
7.5625,
-0.736328125,
-4.43359375,
2.25,
-9.1875,
1.673828125,
4.984375,
-10.4296875,
-2.6484375,
1.4833984375,
1.83203125,
-2.642578125,
5.05078125,
1.529296875,
2.7890625,
-5.41796875,
1.2275390625,
-6.1796875,
1.380859375,
-1.84375,
-10.03125,
22.25,
-3.494140625,
6.33203125,
6.51171875,
1.4736328125,
-2.142578125,
1.3203125,
1.2646484375,
9.6484375,
-6.5625,
-0.83154296875,
0.333251953125,
-2.572265625,
-2.5390625,
-7.46875,
-6.65625,
2.7734375,
-3.0859375,
-0.92626953125,
1.1943359375,
-4.875,
-0.09844970703125,
-1.6572265625,
-0.90185546875,
3.595703125,
4.73046875,
13.171875,
6.78125,
-0.482421875,
-0.599609375,
3.38671875,
-6.828125,
0.7724609375,
0.66259765625,
3.15234375,
-5.203125,
-0.1331787109375,
2.890625,
-1.1787109375,
9.7734375,
4.7109375,
-5.63671875,
0.337646484375,
-4.18359375,
3.080078125,
4.1171875,
2.994140625,
-10.359375,
6.796875,
1.0556640625,
1.8330078125,
-0.72705078125,
2.935546875,
8.6640625,
8.359375,
2.955078125,
-2.76171875,
-0.365966796875,
-4.95703125,
-0.2269287109375,
-3.46875,
3.267578125,
-5.41796875,
-11.125,
-0.9677734375,
2.8828125,
3.427734375,
-2.33203125,
1.3115234375,
8.34375,
3.455078125,
4.57421875,
1.3935546875,
1.09375,
-2.74609375,
1.005859375,
3.447265625,
-1.439453125,
-6.3828125,
5.30859375,
-6.6796875,
13.0703125,
-4.47265625,
4.4296875,
-6.1640625,
1.7138671875,
-2.146484375,
3.150390625,
-2.828125,
-5.51953125,
-5.359375,
3.15234375,
5.3125,
7.23046875,
-1.8046875,
0.70361328125,
-5.78125,
-0.3701171875,
4.30859375,
1.9228515625,
5.5390625,
-2.07421875,
-8.2578125,
-6.9140625,
0.016693115234375,
1.3720703125,
8.8828125,
6.34375,
-6.77734375,
2.9453125,
-2.126953125,
-11.25,
2.296875,
6.37109375,
7.921875,
0.52587890625,
-19.5625,
-4.734375,
4.6328125,
-0.7666015625,
-5.62890625,
-3.82421875,
2.435546875,
-1.572265625,
-7.87890625,
-5.95703125,
-1.6015625,
1.8408203125,
-4.74609375,
1.1708984375,
2.376953125,
5.66796875,
1.267578125,
-0.97607421875,
-15.1640625,
-0.269775390625,
0.0989990234375,
-1.119140625,
-5.84765625,
3.298828125,
5.19140625,
-5.375,
-4.78515625,
3.96484375,
-4.3984375,
-3.701171875,
-6.2265625,
-2.041015625,
1.705078125,
-0.671875,
5.00390625,
0.96826171875,
-1.7958984375,
1.5537109375,
8.4609375,
-9.0625,
-2.4453125,
3.189453125,
-0.72607421875,
-6.6875,
2.87890625,
1.794921875,
7.2734375,
-3.189453125,
-3.486328125,
-11.796875,
1.7275390625,
-8.25,
-5.328125,
-3.05859375,
0.15869140625,
1.51171875,
-5.1953125,
1.044921875,
-1.8935546875,
1.93359375,
3.923828125,
6.06640625,
1.087890625,
1.4111328125,
13.9609375,
-3.9296875,
-0.1485595703125,
-4.02734375,
-0.6904296875,
1.720703125,
1.49609375,
-6.7890625,
-5.1640625,
-2.9140625,
-0.306884765625,
0.327392578125,
-0.50048828125,
0.49560546875,
-1.1025390625,
2.755859375,
0.477783203125,
-4.40625,
0.861328125,
-2.224609375,
6.21875,
2.4453125,
-5.0078125,
-2.62890625,
-3.021484375,
-4.765625,
5.7890625,
-5.92578125,
4.40625,
-4.61328125,
0.439453125,
-2.548828125,
-5.3125,
-1.68359375,
0.3955078125,
-3.515625,
1.169921875,
4.94921875,
-6.7890625,
-2.4375,
-0.88720703125,
-0.344970703125,
6.62890625,
1.201171875,
7.74609375,
3.326171875,
-2.734375,
-6.2109375,
-1.9912109375,
0.78271484375,
1.5947265625,
-8.6953125,
-1.755859375,
13.421875,
-4.23828125,
6.12109375,
-9.9609375,
-2.375,
2.52734375,
-0.043731689453125,
2.5078125,
4.48046875,
2.626953125,
-5.75390625,
2.53515625,
7.77734375,
-1.443359375,
-4.87890625,
-8.15625,
-0.9716796875,
-0.8837890625,
-1.3447265625,
-0.61181640625,
1.4365234375,
-5.01171875,
-12.046875,
-4.296875,
-1.4990234375,
-0.24755859375,
2.48046875,
6.3203125,
6.49609375,
-5.59765625,
-7.01171875,
-4.09375,
2.56640625,
7.51171875,
0.8447265625,
3.09765625,
-6.20703125,
-3.01171875,
4.62109375,
-3.57421875,
-9.1953125,
2.9140625,
9.796875,
4.48828125,
-9.3671875,
-10.71875,
-6.640625,
2.9609375,
3.630859375,
1.01171875,
2.5625,
1.6171875,
1.263671875,
-2.064453125,
-3.759765625,
1.1201171875,
-2.19921875,
4.125,
-3.08984375,
4.9765625,
0.69091796875,
-10.859375,
5.66796875,
7.33203125,
4.921875,
1.400390625,
7.09375,
-5.9453125,
-6.08984375,
1.2880859375,
-3.994140625,
-6.4921875,
-0.410400390625,
2.173828125,
3.7890625,
0.97021484375,
-10.8828125,
-1.806640625,
9.0703125,
7.49609375,
3.583984375,
0.86669921875,
1.513671875,
2.33984375,
3.96484375,
-0.1475830078125,
-3.78515625,
7.39453125,
-7.265625,
-1.2431640625,
-1.51953125,
0.115478515625,
-0.7958984375,
-1.2626953125,
-7.10546875,
4.84765625,
1.7568359375,
3.3046875,
-3.7890625,
7.2890625,
-2.740234375,
6.51953125,
-4.1484375,
-3.279296875,
7.1171875,
1.3134765625,
3.373046875,
5.43359375,
-2.353515625,
1.3310546875,
-3.697265625,
4.83984375,
-2.916015625,
-3.703125,
-4.6484375,
-3.669921875,
0.7373046875,
-11.40625,
0.68408203125,
1.2060546875,
-2.521484375,
-6.859375,
3.6328125,
0.73876953125,
-0.75732421875,
-6.015625,
-6.80859375,
-7.41796875,
8.484375,
-4.29296875,
-3.521484375,
4.125,
2.9375,
2.478515625,
-2.966796875,
4.2421875,
-2.484375,
-7.73046875,
-2.134765625,
-4.09375,
-1.8544921875,
-0.041015625,
1.7470703125,
5.37890625,
-2.86328125,
-3.658203125,
0.83740234375,
-13.1875,
1.3193359375,
1.8818359375,
9.0859375,
-7.14453125,
-9.953125,
7.67578125,
-1.3564453125,
2.736328125,
-10.9296875,
-2.8359375,
20.359375,
3.501953125,
3.328125,
-6.42578125,
-8.5234375,
-0.2252197265625,
-4.0,
2.498046875,
2.646484375,
2.185546875,
-0.828125,
0.841796875,
2.697265625,
2.048828125,
-5.57421875,
1.1328125,
-8.09375,
7.80859375,
-1.7373046875,
-10.109375,
4.5078125,
7.20703125,
-1.3203125,
2.55859375,
2.408203125,
-3.107421875,
2.921875,
-1.1025390625,
6.36328125,
-0.44287109375,
-8.578125,
2.6015625,
7.0390625,
-4.6953125,
13.265625,
-14.3671875,
8.1953125,
-8.1328125,
-2.681640625,
0.241943359375,
-4.46484375,
0.66259765625,
3.72265625,
4.86328125,
-3.806640625,
6.28125,
6.2421875,
4.85546875,
12.5078125,
6.74609375,
-9.0,
3.15234375,
5.9921875,
2.271484375,
-5.171875,
3.615234375,
3.412109375,
-0.220947265625,
1.6650390625,
-10.109375,
2.537109375,
-5.34375,
-6.12890625,
4.65625,
-3.25,
-0.046051025390625,
4.12109375,
7.39453125,
-3.515625,
-9.9609375,
3.919921875,
4.4140625,
5.37890625,
11.0703125,
3.58984375,
1.4482421875,
3.544921875,
1.541015625,
1.0595703125,
-1.09765625,
-2.25390625,
3.662109375,
2.109375,
-10.984375,
-1.9443359375,
-2.17578125,
4.734375,
2.021484375,
-2.119140625,
-2.28515625,
4.12890625,
-7.92578125,
2.953125,
-1.9365234375,
5.01953125,
1.4072265625,
1.6318359375,
8.828125,
4.203125,
1.326171875,
9.15625,
-6.609375,
-4.34375,
-7.90625,
3.287109375,
-1.9248046875,
-3.533203125,
-0.411865234375,
-3.384765625,
-0.8603515625,
-5.1171875,
0.8896484375,
-3.501953125,
3.197265625,
0.951171875,
2.09375,
0.30712890625,
-3.08203125,
1.3759765625,
-1.0673828125,
8.9609375,
5.37890625,
5.01953125,
4.109375,
4.40234375,
-3.654296875,
3.099609375,
-2.037109375,
2.205078125,
6.62890625,
-1.515625,
2.6875,
-7.84765625,
-5.453125,
-2.654296875,
-9.75,
-4.12890625,
0.1912841796875,
4.81640625,
-3.62109375,
1.162109375,
-1.12890625,
0.83203125,
-0.135986328125,
-0.70166015625,
0.256591796875,
4.9765625,
-4.40234375,
-3.64453125,
7.5,
-0.307373046875,
3.716796875,
7.1015625,
2.6484375,
0.30126953125,
6.02734375,
-6.44921875,
-6.5859375,
5.25,
2.77734375,
-4.6328125,
4.8125,
2.974609375,
-4.26953125,
0.06719970703125,
-5.921875,
0.1727294921875,
-1.34375,
-4.25,
2.640625,
-4.9453125,
-7.6640625,
-0.66357421875,
-2.44921875,
-2.17578125,
1.7626953125,
2.400390625,
-4.85546875,
-4.09375,
-9.921875,
1.7626953125,
0.350830078125,
2.173828125,
8.8125,
4.8515625,
-2.45703125,
-3.708984375,
-0.83837890625,
-5.453125,
-4.03125,
0.7607421875,
3.271484375,
-2.41015625,
-1.1953125,
-4.3515625,
-7.0546875,
-0.8720703125,
0.619140625,
3.154296875,
5.4453125,
10.59375,
-4.95703125,
10.375,
-6.4765625,
2.015625,
0.71142578125,
-2.3671875,
-5.1484375,
-3.376953125,
-0.0264434814453125,
1.4267578125,
0.1434326171875,
-5.875,
-3.80078125,
-1.6923828125,
-2.37890625,
-4.0078125,
-7.4765625,
-5.796875,
0.1781005859375,
-3.06640625,
-3.91015625,
-9.46875,
-2.19921875,
-4.64453125,
4.07421875,
8.328125,
-3.517578125,
-1.85546875,
-3.40234375,
-6.5703125,
6.95703125,
-0.98193359375,
-0.1280517578125,
-1.8349609375,
-1.4287109375,
-3.3046875,
1.8037109375,
4.203125,
0.8291015625,
-1.1279296875,
-5.5078125,
3.140625,
8.9921875,
-3.810546875,
3.6875,
-3.681640625,
2.1953125,
-2.43359375,
2.66015625,
-6.65234375,
2.376953125,
5.01953125,
3.052734375,
-7.8828125,
2.720703125,
3.78125,
-2.212890625,
3.939453125,
-8.140625,
-4.84375,
3.953125,
-8.2109375,
5.234375,
0.494873046875,
5.45703125,
-8.9921875,
-1.8525390625,
-0.88134765625,
-2.484375,
8.1640625,
6.73046875,
-9.28125,
2.09765625,
9.7109375,
3.2265625,
11.1953125,
16.265625,
0.2464599609375,
1.71875,
-1.9140625,
-7.0234375,
34.21875,
-9.3671875,
3.357421875,
3.21875,
-6.26171875,
-1.9375,
2.25,
4.84765625,
0.9345703125,
6.1328125,
-1.3798828125,
8.5234375,
-2.29296875,
-6.4140625,
-9.4140625,
-1.818359375,
-1.1767578125,
0.8818359375,
5.91015625,
-0.77734375,
-7.14453125,
2.294921875,
3.05859375,
0.0924072265625,
1.431640625,
-3.28125,
-0.849609375,
-3.322265625,
-4.6328125,
0.65625,
3.861328125,
-2.484375,
-1.8359375,
7.640625,
-21.125,
1.9716796875,
1.4794921875,
-0.7099609375,
3.701171875,
-0.90771484375,
3.8671875,
-1.9638671875,
-4.40234375,
-36.96875,
0.238037109375,
-2.2421875,
-0.8720703125,
-1.66015625,
-4.8828125,
-3.1171875,
7.94140625,
-6.05078125,
-1.470703125,
-3.322265625,
2.912109375,
7.9375,
5.28125,
-2.267578125,
-5.05859375,
1.11328125,
1.9033203125,
-1.9404296875,
-5.78125,
1.5625,
-5.62890625,
-2.6875,
-15.0546875,
2.650390625,
2.67578125,
46.9375,
0.7265625,
0.058319091796875,
-0.7705078125,
4.53515625,
-2.951171875,
5.0859375,
3.029296875,
3.30078125,
3.326171875,
-0.857421875,
-1.076171875,
4.4453125,
-2.53125,
-2.59375,
-5.0234375,
3.44140625,
-3.341796875,
9.6953125,
6.609375,
2.93359375,
3.5390625,
-1.421875,
-0.7509765625,
0.1888427734375,
-4.8515625,
-0.38671875,
-0.55517578125,
-5.15234375,
-2.361328125,
-2.087890625,
-0.5771484375,
0.8876953125,
-4.546875,
1.1923828125,
-2.978515625,
2.2734375,
-2.669921875,
0.06756591796875,
1.171875,
4.71484375,
0.419921875,
3.193359375,
-6.921875,
3.451171875,
1.3583984375,
-5.22265625,
-7.03515625,
1.4267578125,
-9.1015625,
6.1796875,
0.88134765625,
-2.603515625,
-2.46875,
-9.1015625,
2.892578125,
-2.607421875,
10.125,
-28.71875,
-6.78125,
1.763671875,
-4.16015625,
-2.21484375,
3.9609375,
-6.4453125,
0.689453125,
4.04296875,
-5.61328125,
-4.1875,
4.83203125,
8.390625,
0.875,
2.126953125,
-2.201171875,
-2.5078125,
-0.06353759765625,
-1.93359375,
-40.03125,
-5.8828125,
3.4609375,
-1.8203125,
9.1171875,
3.171875,
-3.556640625,
-8.390625,
0.26171875,
-1.7724609375,
-5.76171875,
1.986328125,
-10.484375,
-4.79296875,
1.9140625,
-3.416015625,
6.765625,
-4.05859375,
-2.99609375,
-4.71484375,
3.015625,
-8.671875,
2.4140625,
6.0078125,
-0.82568359375,
-4.0859375,
-3.796875,
-0.29345703125,
13.2734375,
1.2109375,
-2.017578125,
-0.56494140625,
1.603515625,
-4.64453125,
-11.34375,
-2.373046875,
0.172607421875,
5.1953125,
0.625,
-1.962890625,
-0.482666015625,
-8.8203125,
6.3203125,
3.98828125,
2.236328125,
-2.748046875,
-9.1953125,
0.1025390625,
2.41015625,
-4.26953125,
-0.431884765625,
-7.2890625,
1.85546875,
-8.25,
-5.71484375,
3.68359375,
3.017578125,
-3.392578125,
-5.47265625,
-3.125,
0.7421875,
2.185546875,
-1.62109375,
13.8203125,
-3.244140625,
-2.052734375,
0.595703125,
-2.1875,
2.228515625,
-1.8388671875,
-1.30078125,
-0.79296875,
-0.40185546875,
5.38671875,
-0.319091796875,
-3.072265625,
1.333984375,
-2.642578125,
3.310546875,
3.5390625,
4.49609375,
-4.15625,
-0.79052734375,
7.578125,
8.0859375,
3.037109375,
-2.830078125,
0.438720703125,
3.421875,
1.6220703125,
-8.9296875,
2.138671875,
-0.302734375,
-4.05859375,
2.083984375,
-0.0638427734375,
7.625,
-3.90234375,
3.359375,
-3.396484375,
2.533203125,
-11.3515625,
-1.7529296875,
-1.1357421875,
-4.91015625,
0.05279541015625,
4.9609375,
-5.92578125,
-1.2685546875,
7.85546875,
-5.62890625,
3.2265625,
2.638671875,
0.325439453125,
2.779296875,
0.26611328125,
0.55615234375,
5.2734375,
-3.919921875,
-2.60546875,
-1.669921875,
-3.6875,
-2.80078125,
1.30078125,
6.359375,
-0.216552734375,
-3.8046875,
4.9609375,
-2.771484375,
-1.20703125,
6.13671875,
0.0716552734375,
0.0102081298828125,
-5.00390625,
-4.73828125,
-4.52734375,
-0.305908203125,
-6.49609375,
5.1171875,
4.9765625,
-3.9609375,
1.0439453125,
1.4853515625,
-0.00217437744140625,
0.05731201171875,
0.99462890625,
-1.7021484375,
-0.55908203125,
-2.486328125,
3.716796875,
3.021484375,
5.3828125,
-1.62109375,
-4.84375,
-4.359375,
-0.50634765625,
0.50537109375,
-5.74609375,
2.595703125,
8.765625,
-4.47265625,
2.88671875,
-1.638671875,
2.8671875,
-2.28125,
10.1484375,
3.041015625,
-3.029296875,
1.7568359375,
2.697265625,
1.6533203125,
-0.1822509765625,
-4.37109375,
3.326171875,
-0.62744140625,
-5.52734375,
-1.845703125,
-0.8916015625,
-1.3505859375,
-5.08984375,
-1.806640625,
-0.36669921875,
-6.9921875,
10.3515625,
-2.14453125,
-1.41796875,
3.529296875,
8.4609375,
-1.3388671875,
-1.0048828125,
2.380859375,
1.4560546875,
-3.107421875,
-4.0625,
-6.11328125,
-0.4404296875,
-8.8515625,
-1.1845703125,
-5.85546875,
-0.8076171875,
14.2109375,
-0.25048828125,
7.91796875,
-9.0546875,
-0.17822265625,
12.4453125,
1.4482421875,
1.82421875,
5.0546875,
6.02734375,
5.28515625,
0.88671875,
3.830078125,
1.3857421875,
-0.53662109375,
1.7314453125,
-1.9111328125,
-0.81494140625,
11.0390625,
7.578125,
-7.86328125,
-4.8125,
5.015625,
-5.921875,
5.23828125,
-1.275390625,
-1.9443359375,
-0.358154296875,
-0.34814453125,
1.83203125,
1.830078125,
1.5654296875,
0.4892578125,
1.3310546875,
-5.5546875,
-7.125,
-5.55859375,
6.8515625,
-1.4072265625,
6.95703125,
4.36328125,
0.438720703125,
-6.13671875,
-5.09375,
1.2216796875,
-3.1171875,
1.390625,
-6.21484375,
-1.984375,
9.546875,
0.459716796875,
6.73828125,
-10.25,
-0.87060546875,
-10.6640625,
-4.03515625,
5.0390625,
4.390625,
-8.2578125,
-5.84375,
0.65966796875,
1.826171875,
-5.984375,
7.47265625,
0.484130859375,
-1.990234375,
1.0986328125,
-5.0859375,
-5.31640625,
4.18359375,
3.38671875,
-9.4921875,
3.0859375,
-0.0447998046875,
-5.796875,
1.431640625,
-11.7890625,
-2.974609375,
2.251953125,
-4.19921875,
8.3203125,
-3.7734375,
0.98974609375,
-1.57421875,
-1.9970703125,
-3.962890625,
-0.130859375,
5.67578125,
8.640625,
6.23046875,
1.19921875,
-4.875,
0.13671875,
-2.83203125,
3.150390625,
-8.375,
-3.349609375,
1.40234375,
4.8828125,
-5.24609375,
-1.9453125,
-0.716796875,
7.80859375,
2.474609375,
-4.11328125,
0.57421875,
-0.0200653076171875,
5.5703125,
6.21875,
6.01171875,
-3.18359375,
-6.203125,
4.328125,
-3.494140625,
-4.09765625,
-4.01171875,
0.5791015625,
-2.689453125,
-9.09375,
-3.841796875,
2.123046875,
-1.84765625,
4.1171875,
-0.75634765625,
3.6484375,
4.6875,
-6.52734375,
-0.312744140625,
2.2578125,
0.11285400390625,
6.62109375,
-6.8203125,
-3.181640625,
-4.86328125,
-3.2734375,
4.21875,
-4.15625,
-4.44921875,
0.6650390625,
16.078125,
-4.65625,
1.5234375,
-2.5390625,
-2.0546875,
3.455078125,
-4.78125,
-0.83203125,
-5.421875,
-3.1328125,
3.888671875,
7.87109375,
2.8203125,
5.33203125,
-8.4921875,
-28.15625,
0.740234375,
-0.9267578125,
-2.04296875,
-3.65625,
2.640625,
3.966796875,
2.58203125,
5.24609375,
-1.0791015625,
-0.91357421875,
-8.078125,
1.48046875,
4.484375,
-2.923828125,
3.048828125,
-10.5234375,
-6.25390625,
5.07421875,
2.583984375,
9.03125,
-0.30615234375,
6.80078125,
3.783203125,
-0.193359375,
2.3046875,
-2.990234375,
1.37890625,
7.5625,
2.333984375,
1.9521484375,
-5.58203125,
2.802734375,
-3.01171875,
2.693359375,
1.27734375,
0.888671875,
-5.35546875,
0.11279296875,
-1.599609375,
-3.330078125,
-1.0439453125,
-0.321533203125,
-2.3046875,
0.90625,
3.28515625,
-2.2351741790771484e-05,
-1.7578125,
-4.20703125,
-3.904296875,
1.0390625,
-4.98046875,
1.4619140625,
4.07421875,
-3.25390625,
-4.859375,
3.2421875,
-1.60546875,
-0.9453125,
-3.109375,
-1.482421875,
1.9345703125,
3.513671875,
1.0634765625,
-2.404296875,
2.87109375,
5.50390625,
0.002635955810546875,
-2.201171875,
5.76171875,
-1.5869140625,
6.796875,
3.69140625,
-5.20703125,
-5.80078125,
-0.87060546875,
3.048828125,
2.54296875,
2.158203125,
2.751953125,
0.008544921875,
-2.0234375,
-5.33203125,
-0.030120849609375,
-0.70068359375,
7.53515625,
10.7109375,
3.796875,
-7.57421875,
-1.4501953125,
4.01171875,
-2.224609375,
-9.3828125,
6.1484375,
2.41796875,
-3.33203125,
-2.3359375,
-7.671875,
-0.6259765625,
4.2734375,
1.3515625,
3.021484375,
-9.6328125,
2.353515625,
-1.6435546875,
1.59375,
1.947265625,
4.7421875,
0.27587890625,
0.85205078125,
14.2109375,
-0.358154296875,
0.415283203125,
-4.05859375,
-0.2435302734375,
-2.646484375,
-0.97314453125,
0.344970703125,
0.58935546875,
5.06640625,
3.5390625,
-2.984375,
-8.9375,
-4.02734375,
6.02734375,
5.4921875,
-1.2177734375,
-2.4765625,
-3.0859375,
-1.9736328125,
-0.444091796875,
-6.1953125,
-2.98828125,
-3.052734375,
6.328125,
1.001953125,
-1.82421875,
7.21484375,
7.0,
8.4375,
3.220703125,
0.33447265625,
1.396484375,
2.517578125,
1.5400390625,
-1.4716796875,
5.10546875,
6.2734375,
-7.41015625,
-2.76953125,
1.1962890625,
1.6611328125,
1.4921875,
5.08203125,
11.6953125,
-0.78564453125,
-4.50390625,
0.8916015625,
0.191650390625,
-1.3740234375,
0.91015625,
17.984375,
-6.75390625,
4.6796875,
0.95751953125,
5.25,
-7.27734375,
-1.46875,
-3.099609375,
-0.37841796875,
0.36669921875,
1.3447265625,
1.408203125,
3.859375,
-3.9765625,
-6.1484375,
-0.72021484375,
1.3271484375,
-4.74609375,
7.26953125,
2.484375,
2.978515625,
9.2109375,
3.330078125,
-3.12890625,
-5.25390625,
0.236328125,
-0.99169921875,
7.48046875,
-1.8994140625,
4.42578125,
-3.078125,
1.5830078125,
8.4921875,
7.5,
-3.306640625,
-2.625,
-0.2081298828125,
-1.943359375,
-4.0390625,
-2.962890625,
-2.37890625,
-0.171630859375,
5.02734375,
-3.63671875,
-0.1453857421875,
-2.55078125,
-3.830078125,
1.755859375,
5.64453125,
3.90625,
-3.009765625,
-1.2109375,
4.7109375,
-2.61328125,
4.3984375,
10.734375,
-2.537109375,
0.3388671875,
-0.054443359375,
-6.3671875,
4.21484375,
-3.8046875,
5.7890625,
0.3251953125,
-0.327880859375,
-6.9765625,
-9.0859375,
10.4375,
-1.994140625,
-11.15625,
1.328125,
3.361328125,
0.20751953125,
4.7890625,
4.39453125,
8.078125,
-10.0,
1.3359375,
-1.22265625,
-2.962890625
],
"evol_instruction_score": 1.0,
"evol_response_score": 1.0,
"evolved_instruction": "Can you find and correct any logical errors in the following code snippet and output the corrected code?\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i])) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"evolved_response": "The code snippet provided is correct and does not contain any logical errors.",
"meta": {
"category": "Natural language to code",
"completion": "The original code counts the number of lowercase vowels in a string, but the count should probably be case-insensitive. Here is a case-insensitive version.\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i].toLowerCase())) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"id": 6,
"input": null,
"motivation_app": null,
"prompt": "Can you find and correct any logical errors in the following code snippet and output the corrected code?\r\n\r\nfunction countVowels(str) {\r\n let vowels = [\u0027a\u0027, \u0027e\u0027, \u0027i\u0027, \u0027o\u0027, \u0027u\u0027];\r\n let count = 0;\r\n \r\n for (let i = 0; i \u003c str.length; i++) {\r\n if (vowels.includes(str[i])) {\r\n count++;\r\n }\r\n }\r\n \r\n return count;\r\n}",
"source": "surge",
"subcategory": "Debugging"
},
"model_name": "gpt-3.5-turbo",
"nearest_neighbor_distance": 0.12795357273014518
}
```
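The record above is interesting because its `evolved_response` claims the snippet has no logical errors, while the meta `completion` field notes that the original `countVowels` only matches lowercase vowels. As an illustration only (not part of the dataset), the case-insensitive fix described in the `completion` field can be sketched in Python like this:

```python
def count_vowels(s: str) -> int:
    """Count vowels case-insensitively, mirroring the fix described in the record's `completion` field."""
    vowels = set("aeiou")
    # Lowercase each character before the membership test so 'A' and 'a' both count.
    return sum(1 for ch in s if ch.lower() in vowels)
```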
This subset can be loaded as:
```python
from datasets import load_dataset
ds = load_dataset("distilabel-internal-testing/deita-no-normalization", "deita_filtering")
```
</details>
|
open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1 | ---
pretty_name: Evaluation run of vihangd/dopeyshearedplats-1.3b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vihangd/dopeyshearedplats-1.3b-v1](https://huggingface.co/vihangd/dopeyshearedplats-1.3b-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T13:37:34.130815](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1/blob/main/results_2023-12-13T13-37-34.130815.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26012302704770085,\n\
\ \"acc_stderr\": 0.030820336255728206,\n \"acc_norm\": 0.2621303940455793,\n\
\ \"acc_norm_stderr\": 0.031589269063273896,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.3821066604136214,\n\
\ \"mc2_stderr\": 0.015269097668070952\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3225255972696246,\n \"acc_stderr\": 0.013659980894277368,\n\
\ \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156215\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4848635729934276,\n\
\ \"acc_stderr\": 0.004987494455523719,\n \"acc_norm\": 0.6430989842660825,\n\
\ \"acc_norm_stderr\": 0.004781061390873926\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.034554737023254394,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.034554737023254394\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051958,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051958\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853443,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853443\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747549,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747549\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.27419354838709675,\n \"acc_stderr\": 0.025378139970885196,\n \"\
acc_norm\": 0.27419354838709675,\n \"acc_norm_stderr\": 0.025378139970885196\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114475,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114475\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511784,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511784\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935409,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935409\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645333,\n\
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645333\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279483,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279483\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25871559633027524,\n \"acc_stderr\": 0.01877605231961962,\n \"\
acc_norm\": 0.25871559633027524,\n \"acc_norm_stderr\": 0.01877605231961962\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21518987341772153,\n \"acc_stderr\": 0.026750826994676166,\n \
\ \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847834,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847834\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004257,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004257\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653696,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653696\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2669220945083014,\n\
\ \"acc_stderr\": 0.015818450894777573,\n \"acc_norm\": 0.2669220945083014,\n\
\ \"acc_norm_stderr\": 0.015818450894777573\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098407,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098407\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20261437908496732,\n \"acc_stderr\": 0.023015446877985672,\n\
\ \"acc_norm\": 0.20261437908496732,\n \"acc_norm_stderr\": 0.023015446877985672\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410612,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967287,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967287\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n\
\ \"acc_stderr\": 0.011054538377832327,\n \"acc_norm\": 0.24967405475880053,\n\
\ \"acc_norm_stderr\": 0.011054538377832327\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.022368672562886754,\n\
\ \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.022368672562886754\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322284,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322284\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338735,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338735\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.3821066604136214,\n\
\ \"mc2_stderr\": 0.015269097668070952\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5737963693764798,\n \"acc_stderr\": 0.013898585965412338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \
\ \"acc_stderr\": 0.002389281512077212\n }\n}\n```"
repo_url: https://huggingface.co/vihangd/dopeyshearedplats-1.3b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|arc:challenge|25_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|gsm8k|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hellaswag|10_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T13-37-34.130815.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- '**/details_harness|winogrande|5_2023-12-13T13-37-34.130815.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T13-37-34.130815.parquet'
- config_name: results
data_files:
- split: 2023_12_13T13_37_34.130815
path:
- results_2023-12-13T13-37-34.130815.parquet
- split: latest
path:
- results_2023-12-13T13-37-34.130815.parquet
---
# Dataset Card for Evaluation run of vihangd/dopeyshearedplats-1.3b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vihangd/dopeyshearedplats-1.3b-v1](https://huggingface.co/vihangd/dopeyshearedplats-1.3b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-13T13:37:34.130815](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1/blob/main/results_2023-12-13T13-37-34.130815.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.26012302704770085,
"acc_stderr": 0.030820336255728206,
"acc_norm": 0.2621303940455793,
"acc_norm_stderr": 0.031589269063273896,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.3821066604136214,
"mc2_stderr": 0.015269097668070952
},
"harness|arc:challenge|25": {
"acc": 0.3225255972696246,
"acc_stderr": 0.013659980894277368,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156215
},
"harness|hellaswag|10": {
"acc": 0.4848635729934276,
"acc_stderr": 0.004987494455523719,
"acc_norm": 0.6430989842660825,
"acc_norm_stderr": 0.004781061390873926
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254394,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254394
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051958,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051958
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.03097669299853443,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.03097669299853443
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114475,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114475
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511784,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511784
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935409,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935409
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.023507579020645333,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.023507579020645333
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844082,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25871559633027524,
"acc_stderr": 0.01877605231961962,
"acc_norm": 0.25871559633027524,
"acc_norm_stderr": 0.01877605231961962
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21518987341772153,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.21518987341772153,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004257,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004257
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653696,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653696
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2669220945083014,
"acc_stderr": 0.015818450894777573,
"acc_norm": 0.2669220945083014,
"acc_norm_stderr": 0.015818450894777573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098407,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20261437908496732,
"acc_stderr": 0.023015446877985672,
"acc_norm": 0.20261437908496732,
"acc_norm_stderr": 0.023015446877985672
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410612,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967287,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832327,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832327
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.022368672562886754,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.022368672562886754
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322284,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322284
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338735,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338735
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.3821066604136214,
"mc2_stderr": 0.015269097668070952
},
"harness|winogrande|5": {
"acc": 0.5737963693764798,
"acc_stderr": 0.013898585965412338
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077212
}
}
```
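As an illustrative sketch (not part of the evaluation pipeline, and the harness may weight sub-tasks differently when it fills in the `"all"` entry), the aggregate accuracy can be approximated as the mean of the per-task `acc` values in a results dict shaped like the one above:

```python
# Illustrative: average the per-task "acc" values from a harness-style
# results dict (excerpted here for brevity; the real dict has ~60 tasks).
results = {
    "harness|arc:challenge|25": {"acc": 0.3225255972696246},
    "harness|hellaswag|10": {"acc": 0.4848635729934276},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
}

# Collect every task entry that reports an "acc" metric and average them.
accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```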
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/koakuma_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of koakuma/小悪魔/소악마 (Touhou)
This is the dataset of koakuma/小悪魔/소악마 (Touhou), containing 500 images and their tags.
The core tags of this character are `head_wings, wings, red_hair, long_hair, bat_wings, red_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 502.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500      | 336.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 1098 | 665.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 500      | 462.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1098 | 861.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/koakuma_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
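For the `IMG+TXT` packages listed above, waifuc is not required: based on the package description, after extracting a zip each image is assumed to sit next to a same-named `.txt` file holding its comma-separated tags. A minimal standard-library sketch of pairing them (the directory layout is an assumption, not verified here):

```python
import tempfile
from pathlib import Path

def iter_image_tag_pairs(dataset_dir):
    """Yield (image_path, tag_list) for every image with a sibling .txt tag file."""
    root = Path(dataset_dir)
    for image_path in sorted(root.rglob("*")):
        if image_path.suffix.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        tag_file = image_path.with_suffix(".txt")
        if tag_file.exists():
            tags = [t.strip() for t in tag_file.read_text(encoding="utf-8").split(",")]
            yield image_path, tags

# Demonstrate on a throwaway directory mimicking the assumed extracted layout.
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "0001.png").write_bytes(b"")  # stand-in for a real image
    (Path(tmp) / "0001.txt").write_text("1girl, solo, red_necktie", encoding="utf-8")
    pairs = list(iter_image_tag_pairs(tmp))
    print(pairs[0][1])  # ['1girl', 'solo', 'red_necktie']
```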
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, book, shirt, simple_background, solo, long_sleeves, red_necktie, vest, white_background, looking_at_viewer, skirt_set, black_thighhighs, open_mouth, :d, zettai_ryouiki |
| 1 | 32 |  |  |  |  |  | 1girl, red_necktie, solo, white_shirt, black_vest, collared_shirt, looking_at_viewer, simple_background, bangs, hair_between_eyes, blush, smile, white_background, black_skirt, closed_mouth, juliet_sleeves, upper_body, very_long_hair, cowboy_shot, open_mouth, pointy_ears |
| 2 | 5 |  |  |  |  |  | 1girl, solo, blush, book, red_necktie, one_eye_closed |
| 3 | 11 |  |  |  |  |  | 1girl, book, necktie, solo, black_thighhighs, blush, zettai_ryouiki, demon_tail |
| 4 | 5 |  |  |  |  |  | 1girl, blush, large_breasts, solo, navel, nipples, black_panties, black_thighhighs, demon_tail, underwear_only, bow_panties, bra, lingerie, looking_at_viewer, lying, medium_breasts |
| 5 | 24 |  |  |  |  |  | 1girl, large_breasts, solo, looking_at_viewer, pointy_ears, smile, blush, marker_(medium), very_long_hair, uneven_eyes, curvy, simple_background, white_background, millipen_(medium), navel, cleavage, swimsuit, convenient_censoring, nude |
| 6 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, large_breasts, nipples, open_mouth, sex, solo_focus, vaginal, cowgirl_position, girl_on_top, penis, censored, assertive_female, completely_nude, cum_in_pussy, demon_wings, looking_at_viewer, navel, pink_hair, pointy_ears, pov, saliva, smile, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | book | shirt | simple_background | solo | long_sleeves | red_necktie | vest | white_background | looking_at_viewer | skirt_set | black_thighhighs | open_mouth | :d | zettai_ryouiki | white_shirt | black_vest | collared_shirt | bangs | hair_between_eyes | blush | smile | black_skirt | closed_mouth | juliet_sleeves | upper_body | very_long_hair | cowboy_shot | pointy_ears | one_eye_closed | necktie | demon_tail | large_breasts | navel | nipples | black_panties | underwear_only | bow_panties | bra | lingerie | lying | medium_breasts | marker_(medium) | uneven_eyes | curvy | millipen_(medium) | cleavage | swimsuit | convenient_censoring | nude | 1boy | hetero | sex | solo_focus | vaginal | cowgirl_position | girl_on_top | penis | censored | assertive_female | completely_nude | cum_in_pussy | demon_wings | pink_hair | pov | saliva | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------|:---------------|:--------------|:-------|:-------------------|:--------------------|:------------|:-------------------|:-------------|:-----|:-----------------|:--------------|:-------------|:-----------------|:--------|:--------------------|:--------|:--------|:--------------|:---------------|:-----------------|:-------------|:-----------------|:--------------|:--------------|:-----------------|:----------|:-------------|:----------------|:--------|:----------|:----------------|:-----------------|:--------------|:------|:-----------|:--------|:-----------------|:------------------|:--------------|:--------|:--------------------|:-----------|:-----------|:-----------------------|:-------|:-------|:---------|:------|:-------------|:----------|:-------------------|:--------------|:--------|:-----------|:-------------------|:------------------|:---------------|:--------------|:------------|:------|:---------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 32 |  |  |  |  |  | X | | | X | X | | X | | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | X | | X | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | | X | | | | | | | X | | | X | | | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | | | | X | | X | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 24 |  |  |  |  |  | X | | | X | X | | | | X | X | | | | | | | | | | | X | X | | | | | X | | X | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | | | | X | X | | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/VALUE_mnli_been_done | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: train
num_bytes: 11563230
num_examples: 48515
- name: dev_matched
num_bytes: 290459
num_examples: 1226
- name: dev_mismatched
num_bytes: 377910
num_examples: 1509
- name: test_matched
num_bytes: 296760
num_examples: 1199
- name: test_mismatched
num_bytes: 380324
num_examples: 1541
download_size: 8136354
dataset_size: 12908683
---
# Dataset Card for "VALUE_mnli_been_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mikazuki_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mikazuki/三日月/三日月 (Azur Lane)
This is the dataset of mikazuki/三日月/三日月 (Azur Lane), containing 40 images and their tags.
The core tags of this character are `animal_ears, blue_eyes, long_hair, hat, very_long_hair, tail, blue_hair, bangs, school_hat, squirrel_ears, hair_between_eyes, bow, squirrel_tail, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 35.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikazuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 40       | 22.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikazuki_azurlane/resolve/main/dataset-800.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 78 | 46.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikazuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 40       | 33.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikazuki_azurlane/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 78 | 62.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikazuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mikazuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, taiyaki, eating, blush, open_mouth, striped_thighhighs, looking_at_viewer, ribbon, blue_dress, chibi, green_hair, white_background |
| 1 | 10 |  |  |  |  |  | 1girl, detached_sleeves, ofuda, qing_guanmao, blush, solo, bare_shoulders, blue_dress, long_sleeves, bandaged_leg, blue_sleeves, jiangshi, looking_at_viewer, sleeves_past_fingers, torn_sleeves, blue_headwear, halloween, parted_lips, sitting, sleeveless_dress, twintails, barefoot, sash, torn_dress, white_background |
| 2 | 5 |  |  |  |  |  | blush, 1girl, blue_dress, chibi, holding_food, white_background, white_sailor_collar, bare_shoulders, ears_through_headwear, eating, off_shoulder, red_bow, solo, taiyaki, :t, closed_mouth, collarbone, food_on_face, frills, hair_bow, parted_lips, puffy_long_sleeves, red_neckerchief, sailor_dress, shirt, single_braid, sleeveless_dress, sleeves_past_wrists, yellow_headwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | taiyaki | eating | blush | open_mouth | striped_thighhighs | looking_at_viewer | ribbon | blue_dress | chibi | green_hair | white_background | detached_sleeves | ofuda | qing_guanmao | bare_shoulders | long_sleeves | bandaged_leg | blue_sleeves | jiangshi | sleeves_past_fingers | torn_sleeves | blue_headwear | halloween | parted_lips | sitting | sleeveless_dress | twintails | barefoot | sash | torn_dress | holding_food | white_sailor_collar | ears_through_headwear | off_shoulder | red_bow | :t | closed_mouth | collarbone | food_on_face | frills | hair_bow | puffy_long_sleeves | red_neckerchief | sailor_dress | shirt | single_braid | sleeves_past_wrists | yellow_headwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:---------|:--------|:-------------|:---------------------|:--------------------|:---------|:-------------|:--------|:-------------|:-------------------|:-------------------|:--------|:---------------|:-----------------|:---------------|:---------------|:---------------|:-----------|:-----------------------|:---------------|:----------------|:------------|:--------------|:----------|:-------------------|:------------|:-----------|:-------|:-------------|:---------------|:----------------------|:------------------------|:---------------|:----------|:-----|:---------------|:-------------|:---------------|:---------|:-----------|:---------------------|:------------------|:---------------|:--------|:---------------|:----------------------|:------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | | | | | X | X | | X | | | | X | | | | | | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Babypotatotang/lld-onlyicon | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 196407715.578
num_examples: 14959
- name: test
num_bytes: 49103652.04
num_examples: 3740
download_size: 156823150
dataset_size: 245511367.618
---
# Dataset Card for "lld-onlyicon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
micsell/hebrew_kan_sentence90000 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: language
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1900103768.0
num_examples: 10000
download_size: 1899330575
dataset_size: 1900103768.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TaylorAI/user_queries_dataset | ---
dataset_info:
features:
- name: query
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 508571326
num_examples: 1493076
download_size: 332374686
dataset_size: 508571326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gigant/ted_descriptions | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: TED descriptions
size_categories:
- 1K<n<10K
source_datasets:
- original
tags: []
task_categories:
- text-generation
task_ids:
- language-modeling
dataset_info:
features:
- name: url
dtype: string
- name: descr
dtype: string
splits:
- name: train
num_bytes: 2617778
num_examples: 5705
download_size: 1672988
dataset_size: 2617778
---
# Dataset Card for TED descriptions
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eitanli/allergy_type_bu | ---
dataset_info:
features:
- name: id
dtype: int64
- name: recipe
dtype: string
- name: allergy_type
dtype: string
splits:
- name: train
num_bytes: 108603536
num_examples: 74465
download_size: 55013888
dataset_size: 108603536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "allergy_type_bu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Opit/bulgarian_tts | ---
license: mit
dataset_info:
features:
- name: audio
dtype: audio
- name: transcript
dtype: string
- name: language
dtype: string
- name: speaker
dtype: int64
splits:
- name: train
num_bytes: 1533274418.51
num_examples: 4114
download_size: 2999177977
dataset_size: 1533274418.51
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Source: https://github.com/vislupus/Bulgarian-TTS-dataset/ |
nuprl/ts-training | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: float64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: float64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: float64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 42270977435
num_examples: 12133148
download_size: 17360072228
dataset_size: 42270977435
extra_gated_prompt: |-
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
2. The Stack is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack to the most recent usable version specified by the maintainers in [the following thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If you have questions about dataset versions and allowed uses, please also ask them in the dataset’s [community discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new). We will also notify users via email when the latest usable version changes.
3. To host, share, or otherwise provide access to The Stack dataset, you must include [these Terms of Use](https://huggingface.co/datasets/bigcode/the-stack#terms-of-use-for-the-stack) and require users to agree to it.
By clicking on "Access repository" below, you accept that your contact information (email address and username) can be shared with the dataset maintainers as well.
extra_gated_fields:
Email: text
I have read the License and agree with its terms: checkbox
---
# Dataset Card for "ts-training"
This is a subset of the TypeScript portion of [The Stack (dedup)](https://huggingface.co/datasets/bigcode/the-stack-dedup), uploaded to the Hugging Face Hub for convenience.
Files with dates _after_ the December 31, 2021 cutoff are excluded from this dataset, since we are using those files for evaluation. Therefore, the remaining files (in this dataset) are available for training.
A file is considered to be after the cutoff if all of `max_{stars|forks|issues}_repo_{stars|forks|issues}_event_min_datetime` (i.e., the first timestamp for a `{stars|forks|issues}` event) are after the cutoff. Otherwise (or if all timestamps are missing), the file is included in this dataset.
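The exclusion predicate described above can be sketched as follows (a hypothetical helper for illustration, not the code used to build the dataset; it assumes the timestamp columns hold ISO-8601 strings or are missing/empty):

```python
from datetime import datetime

CUTOFF = datetime(2021, 12, 31)

TIMESTAMP_KEYS = [
    "max_stars_repo_stars_event_min_datetime",
    "max_issues_repo_issues_event_min_datetime",
    "max_forks_repo_forks_event_min_datetime",
]

def is_after_cutoff(row):
    """Return True if a file would be EXCLUDED from training:
    every available first-event timestamp falls after the cutoff.
    Files with no timestamps at all are kept (returns False)."""
    timestamps = [
        datetime.fromisoformat(row[k])
        for k in TIMESTAMP_KEYS
        if row.get(k)
    ]
    if not timestamps:  # all timestamps missing -> include the file
        return False
    return all(ts > CUTOFF for ts in timestamps)
```

Under this reading, a file with any pre-cutoff event timestamp stays in the training set.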
## Versions
The default version (`main`) is currently `v1.1`.
|Version|Description|
|-|-|
|`v1.1` | Original version of the training dataset, based on v1.1 of the Stack. Applies the training cutoff (December 31, 2021). Used to train OpenTau. |
|`v1.1full` | Training dataset based on v1.1 of the Stack. Does not apply the training cutoff (December 31, 2021), but applies a filter to remove files that do not parse as valid TypeScript. |
|`v1.1p1` | Revision of v1.1. Applies a filter to remove files that do not parse as valid TypeScript. |
|
npvinHnivqn/EnglishDictionary | ---
license: afl-3.0
task_categories:
- token-classification
language:
- en
size_categories:
- 100K<n<1M
--- |
LiukG/gut_phage_and_metagenomic | ---
configs:
- config_name: gut_1024_mini
data_files:
- split: train
path: gut_1024_mini/train.csv
- split: test
path: gut_1024_mini/test.csv
- config_name: gut_1024
data_files:
- split: train
path: gut_1024/train.csv
- split: test
path: gut_1024/test.csv
- config_name: gut_6000
data_files:
- split: train
path: gut_6000/train.csv
- split: test
path: gut_6000/test.csv
- config_name: gut_36000
data_files:
- split: train
path: gut_36000/train.csv
- split: test
path: gut_36000/test.csv
task_categories:
- text-classification
tags:
- biology
pretty_name: bacteriophages and metagenomics
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
creative-graphic-design/CGL-Dataset-v2 | ---
annotations_creators:
- crowdsourced
language:
- zh
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: CGL-Dataset v2
size_categories: []
source_datasets:
- CGL-Dataset
tags:
- graphic design
task_categories:
- other
task_ids: []
---
# Dataset Card for CGL-Dataset-v2
[](https://github.com/shunk031/huggingface-datasets_CGL-Dataset-v2/actions/workflows/ci.yaml)
[](https://github.com/shunk031/huggingface-datasets_CGL-Dataset-v2/actions/workflows/push_to_hub.yaml)
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/liuan0803/RADM
- **Repository:** https://github.com/shunk031/huggingface-datasets_CGL-Dataset-v2
- **Paper (Preprint):** https://arxiv.org/abs/2306.09086
- **Paper (CIKM'23):** https://dl.acm.org/doi/10.1145/3583780.3615028
### Dataset Summary
CGL-Dataset V2 is a dataset for the task of automatic graphic layout design of advertising posters, containing 60,548 training samples and 1,035 test samples. It is an extension of CGL-Dataset.
### Supported Tasks and Leaderboards
[More Information Needed]
<!-- For each of the tasks tagged for this dataset, give a brief description of the tag, metrics, and suggested models (with a link to their HuggingFace implementation if available). Give a similar description of tasks that were not covered by the structured tag set (repace the `task-category-tag` with an appropriate `other:other-task-name`).
- `task-category-tag`: The dataset can be used to train a model for [TASK NAME], which consists in [TASK DESCRIPTION]. Success on this task is typically measured by achieving a *high/low* [metric name](https://huggingface.co/metrics/metric_name). The ([model name](https://huggingface.co/model_name) or [model class](https://huggingface.co/transformers/model_doc/model_class.html)) model currently achieves the following score. *[IF A LEADERBOARD IS AVAILABLE]:* This task has an active leaderboard which can be found at [leaderboard url]() and ranks models based on [metric name](https://huggingface.co/metrics/metric_name) while also reporting [other metric name](https://huggingface.co/metrics/other_metric_name). -->
### Languages
The language data in CGL-Dataset v2 is in Chinese ([BCP-47 zh](https://www.rfc-editor.org/info/bcp47)).
## Dataset Structure
### Data Instances
To use the CGL-Dataset v2 dataset, you need to download `RADM_dataset.tar.gz`, which includes the poster images, text, and text features, via [JD Cloud](https://3.cn/10-dQKDKG) or [Google Drive](https://drive.google.com/file/d/1ezOzR7MX3MFFIfWgJmmEaqXn3iDFp2si/view?usp=sharing).
Then place the downloaded file in the following structure and specify its path.
```shell
/path/to/datasets
└── RADM_dataset.tar.gz
```
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/CGL-Dataset-v2",
data_dir="/path/to/datasets/RADM_dataset.tar.gz",
decode_rle=True, # True if Run-length Encoding (RLE) is to be decoded and converted to binary mask.
include_text_features=True, # True if RoBERTa-based text feature is to be loaded.
)
```
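For intuition, `decode_rle=True` expands a run-length encoding into a binary mask. A self-contained sketch of uncompressed COCO-style RLE decoding (illustrative only; the loader's actual implementation may differ):

```python
import numpy as np

def decode_uncompressed_rle(counts, height, width):
    """Expand COCO-style uncompressed RLE counts into a binary mask.

    Runs alternate between 0s and 1s, starting with 0s, and the mask
    is stored in column-major (Fortran) order, as in pycocotools.
    """
    flat = np.zeros(height * width, dtype=np.uint8)
    pos, value = 0, 0
    for run in counts:
        flat[pos:pos + run] = value
        pos += run
        value = 1 - value
    return flat.reshape((height, width), order="F")
```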
### Data Fields
[More Information Needed]
<!-- List and describe the fields present in the dataset. Mention their data type, and whether they are used as input or output in any of the tasks the dataset currently supports. If the data has span indices, describe their attributes, such as whether they are at the character level or word level, whether they are contiguous or not, etc. If the datasets contains example IDs, state whether they have an inherent meaning, such as a mapping to other datasets or pointing to relationships between data points.
- `example_field`: description of `example_field`
Note that the descriptions can be initialized with the **Show Markdown Data Fields** output of the [Datasets Tagging app](https://huggingface.co/spaces/huggingface/datasets-tagging), you will then only need to refine the generated descriptions. -->
### Data Splits
[More Information Needed]
<!-- Describe and name the splits in the dataset if there are more than one.
Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g. if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here.
Provide the sizes of each split. As appropriate, provide any descriptive statistics for the features, such as average length. For example:
| | train | validation | test |
|-------------------------|------:|-----------:|-----:|
| Input Sentences | | | |
| Average Sentence Length | | | | -->
## Dataset Creation
### Curation Rationale
[More Information Needed]
<!-- What need motivated the creation of this dataset? What are some of the reasons underlying the major choices involved in putting it together? -->
### Source Data
[More Information Needed]
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences,...) -->
#### Initial Data Collection and Normalization
[More Information Needed]
<!-- Describe the data collection process. Describe any criteria for data selection or filtering. List any key words or search terms used. If possible, include runtime information for the collection process.
If data was collected from other pre-existing datasets, link to source here and to their [Hugging Face version](https://huggingface.co/datasets/dataset_name).
If the data was modified or normalized after being collected (e.g. if the data is word-tokenized), describe the process and the tools used. -->
#### Who are the source language producers?
[More Information Needed]
<!-- State whether the data was produced by humans or machine generated. Describe the people or systems who originally created the data.
If available, include self-reported demographic or identity information for the source data creators, but avoid inferring this information. Instead state that this information is unknown. See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as a variables, particularly gender.
Describe the conditions under which the data was created (for example, if the producers were crowdworkers, state what platform was used, or if the data was found, what website the data was found on). If compensation was provided, include that information here.
Describe other people represented or mentioned in the data. Where possible, link to references for the information. -->
### Annotations
[More Information Needed]
<!-- If the dataset contains annotations which are not part of the initial data collection, describe them in the following paragraphs. -->
#### Annotation process
[More Information Needed]
<!-- If applicable, describe the annotation process and any tools used, or state otherwise. Describe the amount of data annotated, if not all. Describe or reference annotation guidelines provided to the annotators. If available, provide interannotator statistics. Describe any annotation validation processes. -->
#### Who are the annotators?
[More Information Needed]
<!-- If annotations were collected for the source data (such as class labels or syntactic parses), state whether the annotations were produced by humans or machine generated.
Describe the people or systems who originally created the annotations and their selection criteria if applicable.
If available, include self-reported demographic or identity information for the annotators, but avoid inferring this information. Instead state that this information is unknown. See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as a variables, particularly gender.
Describe the conditions under which the data was annotated (for example, if the annotators were crowdworkers, state what platform was used, or if the data was found, what website the data was found on). If compensation was provided, include that information here. -->
### Personal and Sensitive Information
[More Information Needed]
<!-- State whether the dataset uses identity categories and, if so, how the information is used. Describe where this information comes from (i.e. self-reporting, collecting from profiles, inferring, etc.). See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as a variables, particularly gender. State whether the data is linked to individuals and whether those individuals can be identified in the dataset, either directly or indirectly (i.e., in combination with other data).
State whether the dataset contains other data that might be considered sensitive (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history).
If efforts were made to anonymize the data, describe the anonymization process. -->
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
<!-- Please discuss some of the ways you believe the use of this dataset will impact society.
The statement should include both positive outlooks, such as outlining how technologies developed through its use may improve people's lives, and discuss the accompanying risks. These risks may range from making important decisions more opaque to people who are affected by the technology, to reinforcing existing harmful biases (whose specifics should be discussed in the next section), among other considerations.
Also describe in this section if the proposed dataset contains a low-resource or under-represented language. If this is the case or if this task has any impact on underserved communities, please elaborate here. -->
### Discussion of Biases
[More Information Needed]
<!-- Provide descriptions of specific biases that are likely to be reflected in the data, and state whether any steps were taken to reduce their impact.
For Wikipedia text, see for example [Dinan et al 2020 on biases in Wikipedia (esp. Table 1)](https://arxiv.org/abs/2005.00614), or [Blodgett et al 2020](https://www.aclweb.org/anthology/2020.acl-main.485/) for a more general discussion of the topic.
If analyses have been run quantifying these biases, please add brief summaries and links to the studies here. -->
### Other Known Limitations
[More Information Needed]
<!-- If studies of the datasets have outlined other limitations of the dataset, such as annotation artifacts, please outline and cite them here. -->
## Additional Information
### Dataset Curators
[More Information Needed]
<!-- List the people involved in collecting the dataset and their affiliation(s). If funding information is known, include it here. -->
### Licensing Information
[More Information Needed]
<!-- Provide the license and link to the license webpage if available. -->
### Citation Information
<!-- Provide the [BibTex](http://www.bibtex.org/)-formatted reference for the dataset. For example:
```
@article{article_id,
author = {Author List},
title = {Dataset Paper Title},
journal = {Publication Venue},
year = {2525}
}
```
If the dataset has a [DOI](https://www.doi.org/), please provide it here. -->
```bibtex
@inproceedings{li2023relation,
title={Relation-Aware Diffusion Model for Controllable Poster Layout Generation},
author={Li, Fengheng and Liu, An and Feng, Wei and Zhu, Honghe and Li, Yaoyu and Zhang, Zheng and Lv, Jingjing and Zhu, Xin and Shen, Junjie and Lin, Zhangang},
booktitle={Proceedings of the 32nd ACM international conference on information & knowledge management},
pages={1249--1258},
year={2023}
}
```
### Contributions
Thanks to [@liuan0803](https://github.com/liuan0803) for creating this dataset.
|
muhammadravi251001/idk_mrc_nli_ner | ---
license: openrail
---
You can load this dataset as follows:
```python
from datasets import load_dataset

data_files = {"train": "data_nli_train_ner_df.csv",
              "validation": "data_nli_val_ner_df.csv",
              "test": "data_nli_test_ner_df.csv"}
dataset = load_dataset("muhammadravi251001/idk_mrc_nli_ner", data_files=data_files)
```
This dataset is a modification of the IDK-MRC dataset into IDK-MRC-NLI, created by converting the QA examples into NLI examples. You can find the original IDK-MRC at this link: https://huggingface.co/datasets/rifkiaputri/idk-mrc.
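The QA-to-NLI conversion can be pictured with a toy sketch (this is a hypothetical labeling scheme for illustration only; see the IDK-MRC paper for the actual conversion procedure):

```python
def qa_to_nli(passage, question, answer, answerable):
    """Turn one QA example into one NLI example.

    Hypothetical scheme: the passage is the premise, and a declarative
    restatement of the question/answer pair is the hypothesis. Answerable
    questions map to entailment, unanswerable ones to contradiction.
    """
    hypothesis = f"The answer to the question '{question}' is '{answer}'."
    label = "entailment" if answerable else "contradiction"
    return {"premise": passage, "hypothesis": hypothesis, "label": label}
```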
### Citation Information
```bibtex
@inproceedings{putri-oh-2022-idk,
title = "{IDK}-{MRC}: Unanswerable Questions for {I}ndonesian Machine Reading Comprehension",
author = "Putri, Rifki Afina and
Oh, Alice",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.465",
pages = "6918--6933",
}
``` |
benayas/snips_artificial_10pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1123974
num_examples: 13084
download_size: 412460
dataset_size: 1123974
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SEACrowd/nusatranslation_emot | ---
tags:
- emotion-classification
language:
- abs
- btk
- bew
- bug
- jav
- mad
- mak
- min
- mui
- rej
- sun
---
# nusatranslation_emot
Democratizing access to natural language processing (NLP) technology is crucial, especially for underrepresented and extremely low-resource languages. Previous research has focused on developing labeled and unlabeled corpora for these languages through online scraping and document translation. While these methods have proven effective and cost-efficient, we have identified limitations in the resulting corpora, including a lack of lexical diversity and cultural relevance to local communities. To address this gap, we conduct a case study on Indonesian local languages. We compare the effectiveness of online scraping, human translation, and paragraph writing by native speakers in constructing datasets. Our findings demonstrate that datasets generated through paragraph writing by native speakers exhibit superior quality in terms of lexical diversity and cultural content. In addition, we present the NusaWrites benchmark, encompassing 12 underrepresented and extremely low-resource languages spoken by millions of individuals in Indonesia. Our empirical experiment results using existing multilingual large language models conclude the need to extend these models to more underrepresented languages.
We introduce novel, high-quality human-curated corpora, i.e., NusaMenulis, which cover 12 languages spoken in Indonesia. The resource extends language coverage to 5 new languages, i.e., Ambon (abs), Bima (bhp), Makassarese (mak), Palembang / Musi (mui), and Rejang (rej).
For the rhetoric mode classification task, we cover 5 rhetoric modes, i.e., narrative, persuasive, argumentative, descriptive, and expository.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
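As a minimal sketch of the loading step described above: the dataset can be fetched with HuggingFace's `load_dataset` using this repository's id. The import is kept inside the function so the snippet reads without `datasets` installed; `trust_remote_code=True` is an assumption (some SEACrowd datasets use a loading script) and may not be required.

```python
def load_nusatranslation_emot():
    """Sketch: load the nusatranslation_emot dataset from the HF Hub.

    Requires `pip install datasets` (and `nusacrowd`, per the note above)
    plus network access; `trust_remote_code=True` is assumed here because
    the repo may ship a custom loading script.
    """
    # Lazy import so defining this helper does not require `datasets`.
    from datasets import load_dataset
    return load_dataset("SEACrowd/nusatranslation_emot", trust_remote_code=True)
```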
## Citation
```
@unpublished{anonymous2023nusawrites,
title={NusaWrites: Constructing High-Quality Corpora for Underrepresented and Extremely Low-Resource Languages},
author={Anonymous},
journal={OpenReview Preprint},
year={2023},
note={anonymous preprint under review}
}
```
## License
Creative Commons Attribution Share-Alike 4.0 International
## Homepage
[https://github.com/IndoNLP/nusa-writes](https://github.com/IndoNLP/nusa-writes)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
ZhangShenao/0.0001_idpo_noreplacerej_decalpha_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: is_better
dtype: bool
splits:
- name: test_prefs_2
num_bytes: 13647690
num_examples: 2000
- name: train_prefs_2
num_bytes: 141047004
num_examples: 20378
download_size: 86040274
dataset_size: 154694694
configs:
- config_name: default
data_files:
- split: test_prefs_2
path: data/test_prefs_2-*
- split: train_prefs_2
path: data/train_prefs_2-*
---
# Dataset Card for "0.0001_idpo_noreplacerej_decalpha_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NeuroSenko/senko-arts-by-rimukoro-512x512 | ---
license: mit
tags:
- Senko
---
## Description
This dataset contains images of Senko-san drawn by Rimukoro. All images are cropped to 512x512, and each image is accompanied by a txt file containing a tag list extracted from a booru site.
## Examples

 |
TinyPixel/elm-sys | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2591217
num_examples: 1073
download_size: 1394627
dataset_size: 2591217
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_rizla__rizla-11 | ---
pretty_name: Evaluation run of rizla/rizla-11
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rizla/rizla-11](https://huggingface.co/rizla/rizla-11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__rizla-11\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T01:26:02.401376](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla-11/blob/main/results_2024-02-12T01-26-02.401376.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/rizla/rizla-11
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|arc:challenge|25_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|gsm8k|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hellaswag|10_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T01-26-02.401376.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- '**/details_harness|winogrande|5_2024-02-12T01-26-02.401376.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T01-26-02.401376.parquet'
- config_name: results
data_files:
- split: 2024_02_12T01_26_02.401376
path:
- results_2024-02-12T01-26-02.401376.parquet
- split: latest
path:
- results_2024-02-12T01-26-02.401376.parquet
---
# Dataset Card for Evaluation run of rizla/rizla-11
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/rizla-11](https://huggingface.co/rizla/rizla-11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__rizla-11",
"harness_winogrande_5",
split="train")
```
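The split names used throughout the configurations above follow directly from the run timestamp: the dashes and colons in the timestamp are replaced with underscores, while the fractional-seconds dot is kept. A minimal sketch of that naming convention (an illustration of the pattern visible in this card, not code used by the leaderboard itself):

```python
# Derive a split name from a run timestamp, mirroring the names in this card:
# "2024-02-12T01:26:02.401376" -> "2024_02_12T01_26_02.401376".
def timestamp_to_split(timestamp: str) -> str:
    # Replace the date dashes and time colons with underscores;
    # the dot before the microseconds is left as-is.
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-02-12T01:26:02.401376"))
# -> 2024_02_12T01_26_02.401376
```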
## Latest results
These are the [latest results from run 2024-02-12T01:26:02.401376](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla-11/blob/main/results_2024-02-12T01-26-02.401376.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
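For a quick sanity check, the per-task numbers above can be aggregated with a few lines of Python. The snippet below is an illustrative sketch that computes the macro-averaged accuracy over the MMLU (`hendrycksTest`) tasks from a results dictionary shaped like the JSON above; only a small excerpt of the full results is inlined here:

```python
# A small excerpt of the results dict above (not the full file), used to
# illustrate how the per-task accuracies can be macro-averaged.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
    "harness|winogrande|5": {"acc": 0.4956590370955012},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average accuracy over {len(mmlu_accs)} tasks: {mmlu_macro_avg:.4f}")
```

The same loop applied to the full JSON file reproduces the kind of aggregate reported in the "all" block at the top of the results.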
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-16000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 657044
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yangyz1230/H3K4me1_not_filtered | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 446604
num_examples: 812
- name: test
num_bytes: 64057
num_examples: 117
download_size: 245515
dataset_size: 510661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-slerp | ---
pretty_name: Evaluation run of Kukedlc/NeuralKrishna-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuralKrishna-7B-slerp](https://huggingface.co/Kukedlc/NeuralKrishna-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T19:36:58.090168](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-slerp/blob/main/results_2024-02-18T19-36-58.090168.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6523555584548738,\n\
\ \"acc_stderr\": 0.032115635893447195,\n \"acc_norm\": 0.651851664471654,\n\
\ \"acc_norm_stderr\": 0.03278312187714774,\n \"mc1\": 0.6009791921664627,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.7429427247081414,\n\
\ \"mc2_stderr\": 0.014371578296188414\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403515,\n\
\ \"acc_norm\": 0.734641638225256,\n \"acc_norm_stderr\": 0.012902554762313962\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7164907388966342,\n\
\ \"acc_stderr\": 0.004497803024345146,\n \"acc_norm\": 0.8895638319059949,\n\
\ \"acc_norm_stderr\": 0.0031279207383941043\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.0466951066387519,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.0466951066387519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6009791921664627,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.7429427247081414,\n\
\ \"mc2_stderr\": 0.014371578296188414\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693633\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuralKrishna-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|arc:challenge|25_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|gsm8k|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hellaswag|10_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T19-36-58.090168.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T19-36-58.090168.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- '**/details_harness|winogrande|5_2024-02-18T19-36-58.090168.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T19-36-58.090168.parquet'
- config_name: results
data_files:
- split: 2024_02_18T19_36_58.090168
path:
- results_2024-02-18T19-36-58.090168.parquet
- split: latest
path:
- results_2024-02-18T19-36-58.090168.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuralKrishna-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/NeuralKrishna-7B-slerp](https://huggingface.co/Kukedlc/NeuralKrishna-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
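The repository and split names above follow a simple convention: the details repo is `details_` plus the model id with `/` replaced by `__`, and each run's split name is the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the helper function names here are illustrative, not part of any library):

```python
def details_repo(model_id: str) -> str:
    # Leaderboard details repos live under "open-llm-leaderboard/",
    # with "/" in the model id replaced by "__".
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

def run_split(timestamp: str) -> str:
    # Split names replace "-" and ":" in the run timestamp with "_".
    return timestamp.replace("-", "_").replace(":", "_")

print(details_repo("Kukedlc/NeuralKrishna-7B-slerp"))
# → open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-slerp
print(run_split("2024-02-18T19:36:58.090168"))
# → 2024_02_18T19_36_58.090168
```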
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-18T19:36:58.090168](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-slerp/blob/main/results_2024-02-18T19-36-58.090168.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of its own config):
```python
{
"all": {
"acc": 0.6523555584548738,
"acc_stderr": 0.032115635893447195,
"acc_norm": 0.651851664471654,
"acc_norm_stderr": 0.03278312187714774,
"mc1": 0.6009791921664627,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.7429427247081414,
"mc2_stderr": 0.014371578296188414
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403515,
"acc_norm": 0.734641638225256,
"acc_norm_stderr": 0.012902554762313962
},
"harness|hellaswag|10": {
"acc": 0.7164907388966342,
"acc_stderr": 0.004497803024345146,
"acc_norm": 0.8895638319059949,
"acc_norm_stderr": 0.0031279207383941043
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.0466951066387519,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.0466951066387519
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6009791921664627,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.7429427247081414,
"mc2_stderr": 0.014371578296188414
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693633
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
johannes-garstenauer/l_cls_labelled_from_distilbert_masking_heaps | ---
dataset_info:
features:
- name: last_cls
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3084000
num_examples: 1000
download_size: 0
dataset_size: 3084000
---
# Dataset Card for "l_cls_labelled_from_distilbert_masking_heaps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lamaabdulaziz/processed_MARBERT_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 6023251.0
num_examples: 12
download_size: 999363
dataset_size: 6023251.0
---
# Dataset Card for "processed_MARBERT_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pbaoo2705/cpgqa_processed-2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: answer
dtype: string
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: train
num_bytes: 9148601
num_examples: 884
download_size: 190231
dataset_size: 9148601
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cpgqa_processed-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/my_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 31
num_examples: 1
download_size: 1349
dataset_size: 31
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_66_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 21256708
num_examples: 14312
download_size: 11157765
dataset_size: 21256708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_66_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_80_1713218140 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 532471
num_examples: 1253
download_size: 277208
dataset_size: 532471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/yamashiro_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yamashiro/山城/山城 (Kantai Collection)
This is the dataset of yamashiro/山城/山城 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `black_hair, red_eyes, hair_ornament, short_hair, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 515.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 347.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1172 | 711.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 477.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1172 | 916.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yamashiro_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, sex, solo_focus, vaginal, blush, navel, penis, spread_legs, on_back, completely_nude, missionary, one_eye_closed, open_mouth, pov, pussy, sweat |
| 1 | 11 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, japanese_clothes, looking_at_viewer, nontraditional_miko, solo, skirt |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, japanese_clothes, nontraditional_miko, solo, turret, cannon, skirt, open_mouth |
| 3 | 14 |  |  |  |  |  | 1girl, japanese_clothes, solo, upper_body, detached_sleeves, looking_at_viewer, nontraditional_miko, headgear, simple_background, smile, wide_sleeves |
| 4 | 5 |  |  |  |  |  | alternate_costume, blush, looking_at_viewer, white_gloves, 1girl, bare_shoulders, cleavage, elbow_gloves, necklace, smile, solo, wedding_dress, white_dress, collarbone, wedding_ring, character_name, flower, petals, strapless_dress, very_long_hair |
| 5 | 5 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, obi, solo, floral_print, long_hair, smile, wide_sleeves, hair_flower, open_mouth, white_kimono, yukata |
| 6 | 6 |  |  |  |  |  | day, blue_sky, cleavage, ocean, outdoors, beach, cloud, open_mouth, sarong, 1girl, 2girls, blush, looking_at_viewer, navel, solo_focus, white_bikini |
| 7 | 7 |  |  |  |  |  | 1girl, blush, alternate_costume, solo, looking_at_viewer, black_sailor_collar, red_neckerchief, simple_background, black_serafuku, black_skirt, hair_between_eyes, pleated_skirt, upper_body, white_background, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | nipples | sex | solo_focus | vaginal | blush | navel | penis | spread_legs | on_back | completely_nude | missionary | one_eye_closed | open_mouth | pov | pussy | sweat | bare_shoulders | detached_sleeves | japanese_clothes | looking_at_viewer | nontraditional_miko | solo | skirt | turret | cannon | upper_body | headgear | simple_background | smile | wide_sleeves | alternate_costume | white_gloves | cleavage | elbow_gloves | necklace | wedding_dress | white_dress | collarbone | wedding_ring | character_name | flower | petals | strapless_dress | very_long_hair | obi | floral_print | long_hair | hair_flower | white_kimono | yukata | day | blue_sky | ocean | outdoors | beach | cloud | sarong | 2girls | white_bikini | black_sailor_collar | red_neckerchief | black_serafuku | black_skirt | hair_between_eyes | pleated_skirt | white_background | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:----------|:------|:-------------|:----------|:--------|:--------|:--------|:--------------|:----------|:------------------|:-------------|:-----------------|:-------------|:------|:--------|:--------|:-----------------|:-------------------|:-------------------|:--------------------|:----------------------|:-------|:--------|:---------|:---------|:-------------|:-----------|:--------------------|:--------|:---------------|:--------------------|:---------------|:-----------|:---------------|:-----------|:----------------|:--------------|:-------------|:---------------|:-----------------|:---------|:---------|:------------------|:-----------------|:------|:---------------|:------------|:--------------|:---------------|:---------|:------|:-----------|:--------|:-----------|:--------|:--------|:---------|:---------|:---------------|:----------------------|:------------------|:-----------------|:--------------|:--------------------|:----------------|:-------------------|:--------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | | X | | | | | | | | | | | | | | X | | | | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | X | | | | | | X | | | | | | | | | | | | X | | | X | | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | X | | | | | | | | | | | | | | X | | | | | | | X | | X | | | | | | | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | | X | | | | X | | X | X | | | | | | | X | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 7 | 7 |  |  |  |  |  | | X | | | | | | X | | | | | | | | | | | | | | | X | | X | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
shidowake/FreedomIntelligence_alpaca-gpt4-japanese_subset_split_5 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4863217.322740098
num_examples: 4997
download_size: 2510718
dataset_size: 4863217.322740098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thomasavare/italian-dataset-deepl-v3 | ---
dataset_info:
features:
- name: english
dtype: string
- name: italian
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: test
num_bytes: 61821
num_examples: 500
download_size: 22699
dataset_size: 61821
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
500 phrases from the waste-classification-v3 test split, translated from English to Italian.
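As a pure-Python sketch of how the four columns relate (the example rows, class names, and indices below are hypothetical illustrations of the schema, not actual dataset entries):

```python
# Hypothetical rows mirroring the card's schema: english, italian,
# Class, and Class_index columns. All values are illustrative only.
rows = [
    {"english": "plastic bottle", "italian": "bottiglia di plastica",
     "Class": "Plastic", "Class_index": 0.0},
    {"english": "banana peel", "italian": "buccia di banana",
     "Class": "Organic", "Class_index": 1.0},
]

def pairs_for_class(rows, cls):
    """Collect (english, italian) sentence pairs for one waste class."""
    return [(r["english"], r["italian"]) for r in rows if r["Class"] == cls]
```

When loading the real data, `load_dataset("thomasavare/italian-dataset-deepl-v3", split="test")` from the `datasets` library should yield rows with these same four fields.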
|
Salama1429/common_voice_Arabic_12.0_Augmented | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 14306290182.938
num_examples: 63546
- name: test
num_bytes: 316503630.559
num_examples: 10433
download_size: 12163898712
dataset_size: 14622793813.497
---
# Dataset Card for "common_voice_12.0_Augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sh-zheng/SurfaceRoughness | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': RoughnessB
'1': RoughnessC
'2': RoughnessD
splits:
- name: train
num_bytes: 49679719.0
num_examples: 66
- name: validation
num_bytes: 17272712.0
num_examples: 9
- name: test
num_bytes: 24382239.0
num_examples: 15
download_size: 91342507
dataset_size: 91334670.0
---
# Dataset Card for "SurfaceRoughness"
### Dataset Summary
A collection of images representing surface roughness categories B, C, and D according to ASCE 7-16 Section 26.7.2.
### Data Structure
An example looks like this:
```python
{'image': <PIL.PngImagePlugin.PngImageFile image mode=RGBA size=1041x639>,
'label': 0,}
```
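The integer `label` in the example above maps onto the category names declared in the `class_label` feature. A minimal pure-Python sketch of that decoding (the `record` below is hypothetical):

```python
# Category names in the order declared by the class_label feature above.
ROUGHNESS_NAMES = ["RoughnessB", "RoughnessC", "RoughnessD"]

def label_to_name(label: int) -> str:
    """Decode an integer label into its surface roughness category name."""
    return ROUGHNESS_NAMES[label]

# Hypothetical record shaped like the example shown earlier.
record = {"label": 0}
print(label_to_name(record["label"]))  # RoughnessB
```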
### Data Split
|             | train | validation | test |
|-------------|------:|-----------:|-----:|
|# of examples| 66    | 9          | 15   | |
mohanrajanbalagan/Project_Risk | ---
language:
- en
license: unknown
---
|
DanyCT25/argilla | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: vectors
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
struct:
- name: category
dtype: int64
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
dtype: 'null'
splits:
- name: train
num_bytes: 1445808
num_examples: 5001
download_size: 0
dataset_size: 1445808
---
# Dataset Card for "argilla"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/futatsuiwa_mamizou_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of futatsuiwa_mamizou/二ッ岩マミゾウ (Touhou)
This is the dataset of futatsuiwa_mamizou/二ッ岩マミゾウ (Touhou), containing 500 images and their tags.
The core tags of this character are `brown_hair, animal_ears, glasses, raccoon_ears, leaf_on_head, short_hair, raccoon_tail, tail, brown_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 511.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 334.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1084 | 643.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 465.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1084 | 831.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/futatsuiwa_mamizou_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, leaf, solo, pince-nez, skirt, smile, bloomers, notepad, bottle, one_eye_closed, sandals, chibi, open_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, leaf, sandals, skirt, solo, smile, pince-nez, sitting |
| 2 | 13 |  |  |  |  |  | 1girl, leaf, smile, solo, bell, hat, kiseru, gourd, skirt, notepad, pince-nez, clog_sandals, sitting |
| 3 | 16 |  |  |  |  |  | 1girl, brown_shirt, leaf, solo, closed_mouth, brown_skirt, raccoon_girl, short_sleeves, simple_background, smile, bangs, holding_smoking_pipe, kiseru, looking_at_viewer, full_body, white_background, :3, bell, hat, round_eyewear, sandals, sitting |
| 4 | 27 |  |  |  |  |  | leaf, 1girl, solo, bangs, green_kimono, looking_at_viewer, long_sleeves, smile, checkered_scarf, raccoon_girl, wide_sleeves, closed_mouth, :3, kiseru, haori, holding_smoking_pipe, one-hour_drawing_challenge, round_eyewear, smoke |
| 5 | 5 |  |  |  |  |  | 1girl, blush, large_breasts, leaf, looking_at_viewer, nipples, nude, solo, pince-nez, smile, barefoot, simple_background, lying, pussy, white_background |
| 6 | 12 |  |  |  |  |  | 1boy, 1girl, blush, hetero, leaf, solo_focus, nipples, penis, large_breasts, sex, vaginal, bar_censor, female_pubic_hair, nude, open_mouth, smile, cum_in_pussy, navel |
| 7 | 7 |  |  |  |  |  | 1girl, leaf, office_lady, solo, pencil_skirt, smile, black_jacket, black_skirt, brown_pantyhose, large_breasts, long_sleeves, looking_at_viewer, looking_over_eyewear, skirt_suit, sunglasses, white_shirt, alternate_costume, bangs, black_footwear, black_pantyhose, collared_shirt, crossed_legs, holding, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | leaf | solo | pince-nez | skirt | smile | bloomers | notepad | bottle | one_eye_closed | sandals | chibi | open_mouth | sitting | bell | hat | kiseru | gourd | clog_sandals | brown_shirt | closed_mouth | brown_skirt | raccoon_girl | short_sleeves | simple_background | bangs | holding_smoking_pipe | looking_at_viewer | full_body | white_background | :3 | round_eyewear | green_kimono | long_sleeves | checkered_scarf | wide_sleeves | haori | one-hour_drawing_challenge | smoke | blush | large_breasts | nipples | nude | barefoot | lying | pussy | 1boy | hetero | solo_focus | penis | sex | vaginal | bar_censor | female_pubic_hair | cum_in_pussy | navel | office_lady | pencil_skirt | black_jacket | black_skirt | brown_pantyhose | looking_over_eyewear | skirt_suit | sunglasses | white_shirt | alternate_costume | black_footwear | black_pantyhose | collared_shirt | crossed_legs | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:------------|:--------|:--------|:-----------|:----------|:---------|:-----------------|:----------|:--------|:-------------|:----------|:-------|:------|:---------|:--------|:---------------|:--------------|:---------------|:--------------|:---------------|:----------------|:--------------------|:--------|:-----------------------|:--------------------|:------------|:-------------------|:-----|:----------------|:---------------|:---------------|:------------------|:---------------|:--------|:-----------------------------|:--------|:--------|:----------------|:----------|:-------|:-----------|:--------|:--------|:-------|:---------|:-------------|:--------|:------|:----------|:-------------|:--------------------|:---------------|:--------|:--------------|:---------------|:---------------|:--------------|:------------------|:-----------------------|:-------------|:-------------|:--------------|:--------------------|:-----------------|:------------------|:-----------------|:---------------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | X | | | X | | | | | X | | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 27 |  |  |  |  |  | X | X | X | | | X | | | | | | | | | | | X | | | | X | | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | | | X | | | | | | | | X | | | | | | | | | | | | X | | X | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
OMotta/Kyron | ---
license: openrail
---
|
mediabiasgroup/anno-lexical | ---
license: cc-by-nc-nd-4.0
dataset_info:
config_name: plain_text
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: source_party
dtype: string
- name: source_name
dtype: string
- name: sentence_id
dtype: string
splits:
- name: train
- name: dev
- name: test
configs:
- config_name: default
data_files:
- split: train
path: "anno-lexical-train.parquet"
- split: dev
path: "anno-lexical-dev.parquet"
- split: test
path: "anno-lexical-test.parquet"
---
|
CyberHarem/z28_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of z28/Z28 (Azur Lane)
This is the dataset of z28/Z28 (Azur Lane), containing 18 images and their tags.
The core tags of this character are `blue_eyes, breasts, ahoge, grey_hair, hat, large_breasts, short_hair, white_hair, beret`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 30.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z28_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 18 | 14.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z28_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 43 | 28.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z28_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 18 | 24.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z28_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 43 | 46.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z28_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/z28_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, bird, simple_background, blush, sideboob, black_headwear, collar, smile, thighhighs, dress, fang, medium_breasts, upper_body, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, blush, green_dress, hairband, solo, hair_flower, looking_at_viewer, grass, green_choker, open_mouth, sitting, white_gloves, :d, alcohol, bangs, barefoot, cleavage, collarbone, food, holding_cup, outdoors, see-through, two_side_up, wine_glass |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | open_mouth | bird | simple_background | blush | sideboob | black_headwear | collar | smile | thighhighs | dress | fang | medium_breasts | upper_body | white_background | green_dress | hairband | hair_flower | grass | green_choker | sitting | white_gloves | :d | alcohol | bangs | barefoot | cleavage | collarbone | food | holding_cup | outdoors | see-through | two_side_up | wine_glass |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:-------|:--------------------|:--------|:-----------|:-----------------|:---------|:--------|:-------------|:--------|:-------|:-----------------|:-------------|:-------------------|:--------------|:-----------|:--------------|:--------|:---------------|:----------|:---------------|:-----|:----------|:--------|:-----------|:-----------|:-------------|:-------|:--------------|:-----------|:--------------|:--------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
naonao0715/lima_PairRM | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: question
dtype: string
- name: output
dtype: string
- name: candidates_texts
dtype: string
- name: ranks
dtype: string
splits:
- name: train
num_bytes: 536439
num_examples: 50
download_size: 250561
dataset_size: 536439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
multi-train/trex-train-multikilt_1107 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 228887845
num_examples: 200000
download_size: 116247120
dataset_size: 228887845
---
# Dataset Card for "trex-train-multikilt_1107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fuyu-quant/ibl-regression-ver4-branch | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: index
dtype: int64
- name: category
dtype: string
splits:
- name: train
num_bytes: 25467924
num_examples: 10000
- name: test
num_bytes: 2546809
num_examples: 1000
download_size: 13611510
dataset_size: 28014733
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
c3po-ai/edgar-corpus | ---
dataset_info:
- config_name: .
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 40306320885
num_examples: 220375
download_size: 10734208660
dataset_size: 40306320885
- config_name: full
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 32237457024
num_examples: 176289
- name: validation
num_bytes: 4023129683
num_examples: 22050
- name: test
num_bytes: 4045734178
num_examples: 22036
download_size: 40699852536
dataset_size: 40306320885
- config_name: year_1993
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 112714537
num_examples: 1060
- name: validation
num_bytes: 13584432
num_examples: 133
- name: test
num_bytes: 14520566
num_examples: 133
download_size: 141862572
dataset_size: 140819535
- config_name: year_1994
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 198955093
num_examples: 2083
- name: validation
num_bytes: 23432307
num_examples: 261
- name: test
num_bytes: 26115768
num_examples: 260
download_size: 250411041
dataset_size: 248503168
- config_name: year_1995
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 356959049
num_examples: 4110
- name: validation
num_bytes: 42781161
num_examples: 514
- name: test
num_bytes: 45275568
num_examples: 514
download_size: 448617549
dataset_size: 445015778
- config_name: year_1996
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 738506135
num_examples: 7589
- name: validation
num_bytes: 89873905
num_examples: 949
- name: test
num_bytes: 91248882
num_examples: 949
download_size: 926536700
dataset_size: 919628922
- config_name: year_1997
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 854201733
num_examples: 8084
- name: validation
num_bytes: 103167272
num_examples: 1011
- name: test
num_bytes: 106843950
num_examples: 1011
download_size: 1071898139
dataset_size: 1064212955
- config_name: year_1998
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 904075497
num_examples: 8040
- name: validation
num_bytes: 112630658
num_examples: 1006
- name: test
num_bytes: 113308750
num_examples: 1005
download_size: 1137887615
dataset_size: 1130014905
- config_name: year_1999
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 911374885
num_examples: 7864
- name: validation
num_bytes: 118614261
num_examples: 984
- name: test
num_bytes: 116706581
num_examples: 983
download_size: 1154736765
dataset_size: 1146695727
- config_name: year_2000
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 926444625
num_examples: 7589
- name: validation
num_bytes: 113264749
num_examples: 949
- name: test
num_bytes: 114605470
num_examples: 949
download_size: 1162526814
dataset_size: 1154314844
- config_name: year_2001
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 964631161
num_examples: 7181
- name: validation
num_bytes: 117509010
num_examples: 898
- name: test
num_bytes: 116141097
num_examples: 898
download_size: 1207790205
dataset_size: 1198281268
- config_name: year_2002
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1049271720
num_examples: 6636
- name: validation
num_bytes: 128339491
num_examples: 830
- name: test
num_bytes: 128444184
num_examples: 829
download_size: 1317817728
dataset_size: 1306055395
- config_name: year_2003
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1027557690
num_examples: 6672
- name: validation
num_bytes: 126684704
num_examples: 834
- name: test
num_bytes: 130672979
num_examples: 834
download_size: 1297227566
dataset_size: 1284915373
- config_name: year_2004
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1129657843
num_examples: 7111
- name: validation
num_bytes: 147499772
num_examples: 889
- name: test
num_bytes: 147890092
num_examples: 889
download_size: 1439663100
dataset_size: 1425047707
- config_name: year_2005
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1200714441
num_examples: 7113
- name: validation
num_bytes: 161003977
num_examples: 890
- name: test
num_bytes: 160727195
num_examples: 889
download_size: 1538876195
dataset_size: 1522445613
- config_name: year_2006
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1286566049
num_examples: 7064
- name: validation
num_bytes: 160843494
num_examples: 883
- name: test
num_bytes: 163270601
num_examples: 883
download_size: 1628452618
dataset_size: 1610680144
- config_name: year_2007
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1296737173
num_examples: 6683
- name: validation
num_bytes: 166735560
num_examples: 836
- name: test
num_bytes: 156399535
num_examples: 835
download_size: 1637502176
dataset_size: 1619872268
- config_name: year_2008
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1525698198
num_examples: 7408
- name: validation
num_bytes: 190034435
num_examples: 927
- name: test
num_bytes: 187659976
num_examples: 926
download_size: 1924164839
dataset_size: 1903392609
- config_name: year_2009
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1547816260
num_examples: 7336
- name: validation
num_bytes: 188897783
num_examples: 917
- name: test
num_bytes: 196463897
num_examples: 917
download_size: 1954076983
dataset_size: 1933177940
- config_name: year_2010
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1493505900
num_examples: 7013
- name: validation
num_bytes: 192695567
num_examples: 877
- name: test
num_bytes: 191482640
num_examples: 877
download_size: 1897687327
dataset_size: 1877684107
- config_name: year_2011
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1481486551
num_examples: 6724
- name: validation
num_bytes: 190781558
num_examples: 841
- name: test
num_bytes: 185869151
num_examples: 840
download_size: 1877396421
dataset_size: 1858137260
- config_name: year_2012
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1463496224
num_examples: 6479
- name: validation
num_bytes: 186247306
num_examples: 810
- name: test
num_bytes: 185923601
num_examples: 810
download_size: 1854377191
dataset_size: 1835667131
- config_name: year_2013
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1468172419
num_examples: 6372
- name: validation
num_bytes: 183570866
num_examples: 797
- name: test
num_bytes: 182495750
num_examples: 796
download_size: 1852839009
dataset_size: 1834239035
- config_name: year_2014
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1499451593
num_examples: 6261
- name: validation
num_bytes: 181568907
num_examples: 783
- name: test
num_bytes: 181046535
num_examples: 783
download_size: 1880963095
dataset_size: 1862067035
- config_name: year_2015
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1472346721
num_examples: 6028
- name: validation
num_bytes: 180128910
num_examples: 754
- name: test
num_bytes: 189210252
num_examples: 753
download_size: 1860303134
dataset_size: 1841685883
- config_name: year_2016
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1471605426
num_examples: 5812
- name: validation
num_bytes: 178310005
num_examples: 727
- name: test
num_bytes: 177481471
num_examples: 727
download_size: 1845967492
dataset_size: 1827396902
- config_name: year_2017
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1459021126
num_examples: 5635
- name: validation
num_bytes: 174360913
num_examples: 705
- name: test
num_bytes: 184398250
num_examples: 704
download_size: 1836306408
dataset_size: 1817780289
- config_name: year_2018
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1433409319
num_examples: 5508
- name: validation
num_bytes: 181466460
num_examples: 689
- name: test
num_bytes: 182594965
num_examples: 688
download_size: 1815810567
dataset_size: 1797470744
- config_name: year_2019
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1421232269
num_examples: 5354
- name: validation
num_bytes: 175603562
num_examples: 670
- name: test
num_bytes: 176336174
num_examples: 669
download_size: 1791237155
dataset_size: 1773172005
- config_name: year_2020
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 1541847387
num_examples: 5480
- name: validation
num_bytes: 193498658
num_examples: 686
- name: test
num_bytes: 192600298
num_examples: 685
download_size: 1946916132
dataset_size: 1927946343
annotations_creators:
- no-annotation
language:
- en
language_creators:
- other
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: EDGAR-CORPUS (10-K Filings from 1993 to 2020)
size_categories:
- 100K<n<1M
source_datasets:
- extended|other
tags:
- research papers
- edgar
- sec
- finance
- financial
- filings
- 10K
- 10-K
- nlp
- research
- econlp
- economics
- business
task_categories:
- other
task_ids: []
duplicated_from: eloukas/edgar-corpus
---
# Dataset Card for EDGAR-CORPUS
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [References](#references)
- [Contributions](#contributions)
## Dataset Description
- **Point of Contact:** Lefteris Loukas
### Dataset Summary
This dataset card is based on the paper **EDGAR-CORPUS: Billions of Tokens Make The World Go Round** by _Lefteris Loukas et al._, published at the _ECONLP 2021_ workshop.
The dataset contains the annual reports (10-K filings) of public companies from 1993 to 2020, extracted from SEC EDGAR.
Each year can also be loaded on its own via the per-year configurations.
Note: since this is a raw corpus, the `train/val/test` splits carry no special meaning; they are provided only to match the default Hugging Face dataset card format.
If you wish to load specific years or specific companies, you probably want to use the open-source software which generated this dataset, EDGAR-CRAWLER: https://github.com/nlpaueb/edgar-crawler.
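Once a split is in memory, filtering by firm or year reduces to simple dictionary checks. A minimal sketch, using hypothetical stand-in rows that mimic the dataset schema (`filename`, `cik`, `year`, `section_*`):

```python
# Hypothetical rows mimicking the dataset schema; real rows carry all section_* fields.
records = [
    {"filename": "a.txt", "cik": "320193", "year": "1997", "section_1": "Business ..."},
    {"filename": "b.txt", "cik": "320193", "year": "1998", "section_1": "Business ..."},
    {"filename": "c.txt", "cik": "789019", "year": "1997", "section_1": "Business ..."},
]

def filter_filings(rows, cik=None, years=None):
    """Keep rows matching an optional CIK and an optional set of years."""
    out = []
    for row in rows:
        if cik is not None and row["cik"] != cik:
            continue
        if years is not None and row["year"] not in years:
            continue
        out.append(row)
    return out

one_firm_1997 = filter_filings(records, cik="320193", years={"1997"})
print([r["filename"] for r in one_firm_1997])  # ['a.txt']
```

For large-scale filtering across the whole corpus, EDGAR-CRAWLER (linked above) is the better tool, since it avoids downloading years you do not need.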
### Supported Tasks
This is a raw dataset/corpus for financial NLP.
As such, there are no annotations or labels.
### Languages
The EDGAR Filings are in English.
## Dataset Structure
### Data Instances
Refer to the dataset preview.
### Data Fields
**filename**: Name of the file on EDGAR from which the report was extracted.<br>
**cik**: EDGAR identifier (Central Index Key) of the firm.<br>
**year**: Year of the report.<br>
**section_1**: Item 1 (Business).<br>
**section_1A**: Item 1A (Risk Factors).<br>
**section_1B**: Item 1B (Unresolved Staff Comments).<br>
**section_2**: Item 2 (Properties).<br>
**section_3**: Item 3 (Legal Proceedings).<br>
**section_4**: Item 4.<br>
**section_5**: Item 5 (Market for Registrant's Common Equity).<br>
**section_6**: Item 6 (Selected Financial Data).<br>
**section_7**: Item 7 (Management's Discussion and Analysis).<br>
**section_7A**: Item 7A (Quantitative and Qualitative Disclosures About Market Risk).<br>
**section_8**: Item 8 (Financial Statements and Supplementary Data).<br>
**section_9**: Item 9 (Changes in and Disagreements with Accountants).<br>
**section_9A**: Item 9A (Controls and Procedures).<br>
**section_9B**: Item 9B (Other Information).<br>
**section_10**: Item 10 (Directors, Executive Officers and Corporate Governance).<br>
**section_11**: Item 11 (Executive Compensation).<br>
**section_12**: Item 12 (Security Ownership of Certain Beneficial Owners).<br>
**section_13**: Item 13 (Certain Relationships and Related Transactions).<br>
**section_14**: Item 14 (Principal Accountant Fees and Services).<br>
**section_15**: Item 15 (Exhibits, Financial Statement Schedules).<br>

Each `section_*` field holds the text of the corresponding item of the annual report; item titles above follow the standard 10-K template, and exact headings may vary across filings and years.
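Because each item is stored as a separate string field, reconstructing the full report text is a matter of joining the non-empty sections in item order. A small sketch over a hypothetical record:

```python
# Order of the section fields as they appear in a 10-K filing.
SECTION_KEYS = [
    "section_1", "section_1A", "section_1B", "section_2", "section_3",
    "section_4", "section_5", "section_6", "section_7", "section_7A",
    "section_8", "section_9", "section_9A", "section_9B", "section_10",
    "section_11", "section_12", "section_13", "section_14", "section_15",
]

def full_text(record):
    """Join the non-empty section fields of one filing into a single string."""
    parts = [record.get(key, "") for key in SECTION_KEYS]
    return "\n\n".join(p for p in parts if p)

# Hypothetical record; real rows carry the full section texts.
record = {"section_1": "Item 1. Business ...", "section_1A": "", "section_7": "Item 7. MD&A ..."}
print(full_text(record))
```

Empty sections (items missing from a given filing) are skipped, so the joined text contains no blank gaps.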
```python
import datasets
# Load the entire dataset
raw_dataset = datasets.load_dataset("eloukas/edgar-corpus", "full")
# Load a specific year and split
year_1993_training_dataset = datasets.load_dataset("eloukas/edgar-corpus", "year_1993", split="train")
```
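Each record exposes the section fields listed above as separate strings. A hypothetical helper for stitching a filing's sections back into one document (the field names follow the dataset schema; the helper itself is illustrative, not part of the dataset) might look like:

```python
# 10-K section fields in the order they appear in the dataset schema.
SECTION_FIELDS = [
    "section_1", "section_1A", "section_1B", "section_2", "section_3",
    "section_4", "section_5", "section_6", "section_7", "section_7A",
    "section_8", "section_9", "section_9A", "section_9B", "section_10",
    "section_11", "section_12", "section_13", "section_14", "section_15",
]

def full_report_text(record: dict) -> str:
    """Concatenate the non-empty sections of one filing into a single string."""
    parts = [record[field] for field in SECTION_FIELDS if record.get(field)]
    return "\n\n".join(parts)
```

For example, `full_report_text(year_1993_training_dataset[0])` would yield that filing's full text as one string.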
### Data Splits
| Config | Training | Validation | Test |
| --------- | -------- | ---------- | ------ |
| full | 176,289 | 22,050 | 22,036 |
| year_1993 | 1,060 | 133 | 133 |
| year_1994 | 2,083 | 261 | 260 |
| year_1995 | 4,110 | 514 | 514 |
| year_1996 | 7,589 | 949 | 949 |
| year_1997 | 8,084 | 1,011 | 1,011 |
| year_1998 | 8,040 | 1,006 | 1,005 |
| year_1999 | 7,864 | 984 | 983 |
| year_2000 | 7,589 | 949 | 949 |
| year_2001 | 7,181 | 898 | 898 |
| year_2002 | 6,636 | 830 | 829 |
| year_2003 | 6,672 | 834 | 834 |
| year_2004 | 7,111 | 889 | 889 |
| year_2005 | 7,113 | 890 | 889 |
| year_2006 | 7,064 | 883 | 883 |
| year_2007 | 6,683 | 836 | 835 |
| year_2008 | 7,408 | 927 | 926 |
| year_2009 | 7,336 | 917 | 917 |
| year_2010 | 7,013 | 877 | 877 |
| year_2011 | 6,724 | 841 | 840 |
| year_2012 | 6,479 | 810 | 810 |
| year_2013 | 6,372 | 797 | 796 |
| year_2014 | 6,261 | 783 | 783 |
| year_2015 | 6,028 | 754 | 753 |
| year_2016 | 5,812 | 727 | 727 |
| year_2017 | 5,635 | 705 | 704 |
| year_2018 | 5,508 | 689 | 688 |
| year_2019 | 5,354 | 670 | 669 |
| year_2020 | 5,480 | 686 | 685 |
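Every config follows an approximate 80/10/10 train/validation/test scheme; a quick arithmetic check against the `full` row of the table above:

```python
# Split sizes for the "full" config, taken from the table above.
train, val, test = 176_289, 22_050, 22_036
total = train + val + test

# Verify the approximate 80/10/10 train/validation/test scheme.
assert abs(train / total - 0.80) < 0.005
assert abs(val / total - 0.10) < 0.005
assert abs(test / total - 0.10) < 0.005
print(f"{train / total:.1%} / {val / total:.1%} / {test / total:.1%}")
```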
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
Initial data was collected and processed by the authors of the research paper **EDGAR-CORPUS: Billions of Tokens Make The World Go Round**.
#### Who are the source language producers?
Public firms filing with the SEC.
### Annotations
#### Annotation process
NA
#### Who are the annotators?
NA
### Personal and Sensitive Information
The dataset contains public filings data from the SEC.
## Considerations for Using the Data
### Social Impact of Dataset
Low to none.
### Discussion of Biases
The dataset covers financial information about public companies; as such, the tone and style of the text are in line with financial literature.
### Other Known Limitations
The dataset needs further cleaning for improved performance.
## Additional Information
### Licensing Information
EDGAR data is publicly available.
### Shoutout
Huge shoutout to [@JanosAudran](https://huggingface.co/JanosAudran) for the HF Card setup!
## Citation
If this work helps or inspires you in any way, please consider citing the relevant paper published at the [3rd Economics and Natural Language Processing (ECONLP) workshop](https://lt3.ugent.be/econlp/) at EMNLP 2021 (Punta Cana, Dominican Republic):
```
@inproceedings{loukas-etal-2021-edgar,
title = "{EDGAR}-{CORPUS}: Billions of Tokens Make The World Go Round",
author = "Loukas, Lefteris and
Fergadiotis, Manos and
Androutsopoulos, Ion and
Malakasiotis, Prodromos",
booktitle = "Proceedings of the Third Workshop on Economics and Natural Language Processing",
month = nov,
year = "2021",
address = "Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.econlp-1.2",
pages = "13--18",
}
```
### References
- [Research Paper] Lefteris Loukas, Manos Fergadiotis, Ion Androutsopoulos, and Prodromos Malakasiotis. EDGAR-CORPUS: Billions of Tokens Make The World Go Round. Third Workshop on Economics and Natural Language Processing (ECONLP). https://arxiv.org/abs/2109.14394 - Punta Cana, Dominican Republic, November 2021.
- [Software] Lefteris Loukas, Manos Fergadiotis, Ion Androutsopoulos, and Prodromos Malakasiotis. EDGAR-CRAWLER. https://github.com/nlpaueb/edgar-crawler (2021)
- [EDGAR CORPUS, but in zip files] EDGAR CORPUS: A corpus for financial NLP research, built from SEC's EDGAR. https://zenodo.org/record/5528490 (2021)
- [Word Embeddings] EDGAR-W2V: Word2vec Embeddings trained on EDGAR-CORPUS. https://zenodo.org/record/5524358 (2021)
- [Applied Research paper where EDGAR-CORPUS is used] Lefteris Loukas, Manos Fergadiotis, Ilias Chalkidis, Eirini Spyropoulou, Prodromos Malakasiotis, Ion Androutsopoulos, and George Paliouras. FiNER: Financial Numeric Entity Recognition for XBRL Tagging. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). https://doi.org/10.18653/v1/2022.acl-long.303 (2022)
|
amlan107/para | ---
dataset_info:
features:
- name: bn
dtype: string
- name: en
dtype: string
splits:
- name: parallel
num_bytes: 12827861
num_examples: 50000
download_size: 6965146
dataset_size: 12827861
configs:
- config_name: default
data_files:
- split: parallel
path: data/parallel-*
---
|
MariaBi/DuolingoAnalysis | ---
license: cc
---
|
freshpearYoon/train_free_28 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604561288
num_examples: 10000
download_size: 1363578312
dataset_size: 9604561288
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
codegood/YU_SC | ---
license: apache-2.0
---
|
OVERLINK/WuWa | ---
license: lgpl-3.0
---
|
CVasNLPExperiments/OK_VQA_google_flan_ul2_mode_VQAv2_visclues_detection_caption_module_filter_ns_100_OE | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 18358
num_examples: 100
download_size: 11109
dataset_size: 18358
---
# Dataset Card for "OK_VQA_google_flan_ul2_mode_VQAv2_visclues_detection_caption_module_filter_ns_100_OE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UrbanSounds/urban_sounds_small | ---
license: apache-2.0
task_categories:
- audio-classification
language:
- nl
- en
tags:
- audio event
- noise pollution
- urban
size_categories:
- n<1K
---
The Urban Sounds dataset consists of audio samples collected in Amsterdam between 2018 and 2020.
The data samples were collected for a project to build a sensor that classifies audio events, with the goal of tackling noise pollution in the city.
This 'urban sounds small' dataset is a small subset of the full dataset, intended for testing and prototyping purposes.
More on the sensor can be found here: https://github.com/sensemakersamsterdam/OpenEars
|
pa-shk/re_dial_tmdb | ---
dataset_info:
- config_name: re_dial
features:
- name: movieMentions
list:
- name: movieId
dtype: string
- name: movieName
dtype: string
- name: respondentQuestions
list:
- name: movieId
dtype: string
- name: suggested
dtype: int32
- name: seen
dtype: int32
- name: liked
dtype: int32
- name: messages
list:
- name: timeOffset
dtype: int32
- name: text
dtype: string
- name: senderWorkerId
dtype: int32
- name: messageId
dtype: int32
- name: conversationId
dtype: int32
- name: respondentWorkerId
dtype: int32
- name: initiatorWorkerId
dtype: int32
- name: initiatorQuestions
list:
- name: movieId
dtype: string
- name: suggested
dtype: int32
- name: seen
dtype: int32
- name: liked
dtype: int32
- name: recommended_movies
sequence:
sequence: string
- name: liked_movies
sequence:
sequence: string
- name: dialogs
list:
list:
- name: messageId
dtype: int64
- name: senderWorkerId
dtype: int64
- name: text
dtype: string
- name: timeOffset
dtype: int64
- name: formatted_dialogs
sequence: string
splits:
- name: train
num_bytes: 70328286.69120033
num_examples: 8554
- name: val
num_bytes: 10770406.308799675
num_examples: 1310
- name: test
num_bytes: 9788058
num_examples: 1310
download_size: 25397794
dataset_size: 90886751.0
- config_name: tmdb
features:
- name: id
dtype: string
- name: name
dtype: string
- name: metadata
struct:
- name: adult
dtype: bool
- name: budget
dtype: int64
- name: genres
dtype: string
- name: imdb_id
dtype: string
- name: original_language
dtype: string
- name: original_title
dtype: string
- name: overview
dtype: string
- name: popularity
dtype: float64
- name: production_companies
dtype: string
- name: production_countries
dtype: string
- name: release_date
dtype: string
- name: revenue
dtype: int64
- name: runtime
dtype: int64
- name: spoken_languages
dtype: string
- name: status
dtype: string
- name: tagline
dtype: string
- name: vote_average
dtype: float64
- name: vote_count
dtype: int64
splits:
- name: train
num_bytes: 3557601
num_examples: 6629
download_size: 2083449
dataset_size: 3557601
configs:
- config_name: re_dial
data_files:
- split: train
path: re_dial/train-*
- split: val
path: re_dial/val-*
- split: test
path: re_dial/test-*
- config_name: tmdb
data_files:
- split: train
path: tmdb/train-*
---
|
TR-LLMs/Open-Platypus-TR | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
splits:
- name: train
num_bytes: 33346286
num_examples: 24926
download_size: 17039616
dataset_size: 33346286
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
markmp/cool_new_dataset | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 4972
num_examples: 5
download_size: 12096
dataset_size: 4972
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cool_new_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/data-standardized_cluster_21_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9681643
num_examples: 9462
download_size: 4191325
dataset_size: 9681643
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_21_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |