| datasetId (string, length 2–117) | card (string, length 19–1.01M) |
|---|---|
iamnguyen/ds_by_sys_prompt_15 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1636232321.8710828
num_examples: 959340
download_size: 1139633886
dataset_size: 1636232321.8710828
---
# Dataset Card for "ds_by_sys_prompt_15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ALBADDAWI/PII | ---
license: apache-2.0
---
|
qbourbon/pb_trainset-1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': 000_airplane
'1': 001_alarm_clock
'2': 002_angel
'3': 003_ant
'4': 004_apple
'5': 005_arm
'6': 006_armchair
'7': 007_ashtray
'8': 008_axe
'9': 009_backpack
'10': 010_banana
'11': 011_barn
'12': 012_baseball_bat
'13': 013_basket
'14': 014_bathtub
'15': 015_bear_(animal)
'16': 016_bed
'17': 017_bee
'18': 018_beer-mug
'19': 019_bell
'20': 020_bench
'21': 021_bicycle
'22': 022_binoculars
'23': 023_blimp
'24': 024_book
'25': 025_bookshelf
'26': 026_boomerang
'27': 027_bottle_opener
'28': 028_bowl
'29': 029_brain
'30': 030_bread
'31': 031_bridge
'32': 032_bulldozer
'33': 033_bus
'34': 034_bush
'35': 035_butterfly
'36': 036_cabinet
'37': 037_cactus
'38': 038_cake
'39': 039_calculator
'40': 040_camel
'41': 041_camera
'42': 042_candle
'43': 043_cannon
'44': 044_canoe
'45': 045_car_(sedan)
'46': 046_carrot
'47': 047_castle
'48': 048_cat
'49': 049_cell_phone
'50': 050_chair
'51': 051_chandelier
'52': 052_church
'53': 053_cigarette
'54': 054_cloud
'55': 055_comb
'56': 056_computer_monitor
'57': 057_computer-mouse
'58': 058_couch
'59': 059_cow
'60': 060_crab
'61': 061_crane_(machine)
'62': 062_crocodile
'63': 063_crown
'64': 064_cup
'65': 065_diamond
'66': 066_dog
'67': 067_dolphin
'68': 068_donut
'69': 069_door
'70': 070_door_handle
'71': 071_dragon
'72': 072_duck
'73': 073_ear
'74': 074_elephant
'75': 075_envelope
'76': 076_eye
'77': 077_eyeglasses
'78': 078_face
'79': 079_fan
'80': 080_feather
'81': 081_fire_hydrant
'82': 082_fish
'83': 083_flashlight
'84': 084_floor_lamp
'85': 085_flower_with_stem
'86': 086_flying_bird
'87': 087_flying_saucer
'88': 088_foot
'89': 089_fork
'90': 090_frog
'91': 091_frying-pan
'92': 092_giraffe
'93': 093_grapes
'94': 094_grenade
'95': 095_guitar
'96': 096_hamburger
'97': 097_hammer
'98': 098_hand
'99': 099_harp
'100': 100_hat
'101': 101_head
'102': 102_head-phones
'103': 103_hedgehog
'104': 104_helicopter
'105': 105_helmet
'106': 106_horse
'107': 107_hot_air_balloon
'108': 108_hot-dog
'109': 109_hourglass
'110': 110_house
'111': 111_human-skeleton
'112': 112_ice-cream-cone
'113': 113_ipod
'114': 114_kangaroo
'115': 115_key
'116': 116_keyboard
'117': 117_knife
'118': 118_ladder
'119': 119_laptop
'120': 120_leaf
'121': 121_lightbulb
'122': 122_lighter
'123': 123_lion
'124': 124_lobster
'125': 125_loudspeaker
'126': 126_mailbox
'127': 127_megaphone
'128': 128_mermaid
'129': 129_microphone
'130': 130_microscope
'131': 131_monkey
'132': 132_moon
'133': 133_mosquito
'134': 134_motorbike
'135': 135_mouse_(animal)
'136': 136_mouth
'137': 137_mug
'138': 138_mushroom
'139': 139_nose
'140': 140_octopus
'141': 141_owl
'142': 142_palm_tree
'143': 143_panda
'144': 144_paper_clip
'145': 145_parachute
'146': 146_parking_meter
'147': 147_parrot
'148': 148_pear
'149': 149_pen
'150': 150_penguin
'151': 151_person_sitting
'152': 152_person_walking
'153': 153_piano
'154': 154_pickup_truck
'155': 155_pig
'156': 156_pigeon
'157': 157_pineapple
'158': 158_pipe_(for_smoking)
'159': 159_pizza
'160': 160_potted_plant
'161': 161_power_outlet
'162': 162_present
'163': 163_pretzel
'164': 164_pumpkin
'165': 165_purse
'166': 166_rabbit
'167': 167_race_car
'168': 168_radio
'169': 169_rainbow
'170': 170_revolver
'171': 171_rifle
'172': 172_rollerblades
'173': 173_rooster
'174': 174_sailboat
'175': 175_santa_claus
'176': 176_satellite
'177': 177_satellite_dish
'178': 178_saxophone
'179': 179_scissors
'180': 180_scorpion
'181': 181_screwdriver
'182': 182_sea_turtle
'183': 183_seagull
'184': 184_shark
'185': 185_sheep
'186': 186_ship
'187': 187_shoe
'188': 188_shovel
'189': 189_skateboard
'190': 190_skull
'191': 191_skyscraper
'192': 192_snail
'193': 193_snake
'194': 194_snowboard
'195': 195_snowman
'196': 196_socks
'197': 197_space_shuttle
'198': 198_speed-boat
'199': 199_spider
'200': 200_sponge_bob
'201': 201_spoon
'202': 202_squirrel
'203': 203_standing_bird
'204': 204_stapler
'205': 205_strawberry
'206': 206_streetlight
'207': 207_submarine
'208': 208_suitcase
'209': 209_sun
'210': 210_suv
'211': 211_swan
'212': 212_sword
'213': 213_syringe
'214': 214_t-shirt
'215': 215_table
'216': 216_tablelamp
'217': 217_teacup
'218': 218_teapot
'219': 219_teddy-bear
'220': 220_telephone
'221': 221_tennis-racket
'222': 222_tent
'223': 223_tiger
'224': 224_tire
'225': 225_toilet
'226': 226_tomato
'227': 227_tooth
'228': 228_toothbrush
'229': 229_tractor
'230': 230_traffic_light
'231': 231_train
'232': 232_tree
'233': 233_trombone
'234': 234_trousers
'235': 235_truck
'236': 236_trumpet
'237': 237_tv
'238': 238_umbrella
'239': 239_van
'240': 240_vase
'241': 241_violin
'242': 242_walkie_talkie
'243': 243_wheel
'244': 244_wheelbarrow
'245': 245_windmill
'246': 246_wine-bottle
'247': 247_wineglass
'248': 248_wrist-watch
'249': 249_zebra
'250': mistery_category
splits:
- name: train
num_bytes: 35887531.19796168
num_examples: 1248
download_size: 36199765
dataset_size: 35887531.19796168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha | ---
pretty_name: Evaluation run of jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha](https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T19:58:29.731575](https://huggingface.co/datasets/open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha/blob/main/results_2024-03-09T19-58-29.731575.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26200232497358983,\n\
\ \"acc_stderr\": 0.030975360992871986,\n \"acc_norm\": 0.26329067860989236,\n\
\ \"acc_norm_stderr\": 0.03176130492930688,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4051814653873621,\n\
\ \"mc2_stderr\": 0.014454247268086058\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29948805460750855,\n \"acc_stderr\": 0.013385021637313563,\n\
\ \"acc_norm\": 0.32764505119453924,\n \"acc_norm_stderr\": 0.013715847940719346\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4182433778131846,\n\
\ \"acc_stderr\": 0.0049226246369452435,\n \"acc_norm\": 0.5377414857598088,\n\
\ \"acc_norm_stderr\": 0.004975546018950675\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.03878139888797609,\n\
\ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.03878139888797609\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.03309615177059008,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.03309615177059008\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.031862098516411426,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.031862098516411426\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838746,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838746\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.03178529710642749,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.03178529710642749\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365904,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365904\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.0219169577092138,\n \
\ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.0219169577092138\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380572,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380572\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24954128440366974,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.24954128440366974,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.31223628691983124,\n \"acc_stderr\": 0.030165137867847008,\n \
\ \"acc_norm\": 0.31223628691983124,\n \"acc_norm_stderr\": 0.030165137867847008\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23318385650224216,\n\
\ \"acc_stderr\": 0.02838039114709472,\n \"acc_norm\": 0.23318385650224216,\n\
\ \"acc_norm_stderr\": 0.02838039114709472\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.02812096650391439,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.02812096650391439\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553995,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553995\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042096,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042096\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\
\ \"acc_stderr\": 0.010885929742002223,\n \"acc_norm\": 0.23859191655801826,\n\
\ \"acc_norm_stderr\": 0.010885929742002223\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n\
\ \"acc_stderr\": 0.032658195885126994,\n \"acc_norm\": 0.30845771144278605,\n\
\ \"acc_norm_stderr\": 0.032658195885126994\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789437,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789437\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4051814653873621,\n\
\ \"mc2_stderr\": 0.014454247268086058\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5895816890292028,\n \"acc_stderr\": 0.013825107120035861\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.0021386703014604483\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|winogrande|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T19-58-29.731575.parquet'
- config_name: results
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- results_2024-03-09T19-58-29.731575.parquet
- split: latest
path:
- results_2024-03-09T19-58-29.731575.parquet
---
# Dataset Card for Evaluation run of jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha](https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha",
"harness_winogrande_5",
	split="latest")
```
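All 57 MMLU subtask configs above follow the same naming pattern, so the config name can be built programmatically rather than copied by hand. The helper below is a hypothetical convenience function, not part of the `datasets` API; the network call is left commented out:

```python
# Hypothetical helper (not part of the datasets API): builds the config
# name for an MMLU subtask, which all follow the same naming pattern.
def mmlu_config(task: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{task}_{n_shot}"

print(mmlu_config("anatomy"))  # harness_hendrycksTest_anatomy_5

# Requires network access; uncomment to fetch that subtask's details:
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha",
#     mmlu_config("anatomy"),
#     split="latest",
# )
```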
## Latest results
These are the [latest results from run 2024-03-09T19:58:29.731575](https://huggingface.co/datasets/open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha/blob/main/results_2024-03-09T19-58-29.731575.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.26200232497358983,
"acc_stderr": 0.030975360992871986,
"acc_norm": 0.26329067860989236,
"acc_norm_stderr": 0.03176130492930688,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4051814653873621,
"mc2_stderr": 0.014454247268086058
},
"harness|arc:challenge|25": {
"acc": 0.29948805460750855,
"acc_stderr": 0.013385021637313563,
"acc_norm": 0.32764505119453924,
"acc_norm_stderr": 0.013715847940719346
},
"harness|hellaswag|10": {
"acc": 0.4182433778131846,
"acc_stderr": 0.0049226246369452435,
"acc_norm": 0.5377414857598088,
"acc_norm_stderr": 0.004975546018950675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.03878139888797609,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.03878139888797609
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.03309615177059008,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.03309615177059008
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.031862098516411426,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.031862098516411426
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838746,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838746
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.03178529710642749,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.03178529710642749
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365904,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365904
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.0219169577092138,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.0219169577092138
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380572,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380572
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24954128440366974,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.24954128440366974,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31223628691983124,
"acc_stderr": 0.030165137867847008,
"acc_norm": 0.31223628691983124,
"acc_norm_stderr": 0.030165137867847008
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23318385650224216,
"acc_stderr": 0.02838039114709472,
"acc_norm": 0.23318385650224216,
"acc_norm_stderr": 0.02838039114709472
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02812096650391439,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02812096650391439
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553995,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553995
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042096,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042096
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.010885929742002223,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.010885929742002223
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.032658195885126994,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.032658195885126994
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789437,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789437
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4051814653873621,
"mc2_stderr": 0.014454247268086058
},
"harness|winogrande|5": {
"acc": 0.5895816890292028,
"acc_stderr": 0.013825107120035861
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.0021386703014604483
}
}
```
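The leaderboard's headline score for a model is, as far as the leaderboard documents it, the unweighted mean of the six benchmark metrics (ARC `acc_norm`, HellaSwag `acc_norm`, MMLU `acc`, TruthfulQA `mc2`, Winogrande `acc`, GSM8K `acc`). A quick sketch reproducing that average from the figures above, under that averaging assumption:

```python
# Headline benchmark scores copied from the results JSON above; the
# leaderboard average is assumed to be their unweighted mean.
scores = {
    "ARC (25-shot, acc_norm)": 0.32764505119453924,
    "HellaSwag (10-shot, acc_norm)": 0.5377414857598088,
    "MMLU (5-shot, acc)": 0.26200232497358983,
    "TruthfulQA (0-shot, mc2)": 0.4051814653873621,
    "Winogrande (5-shot, acc)": 0.5895816890292028,
    "GSM8K (5-shot, acc)": 0.006065200909780136,
}

average = sum(scores.values()) / len(scores)
print(f"{average:.4f}")  # roughly 0.3547
```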
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b | ---
pretty_name: Evaluation run of yhyhy3/med-orca-instruct-33b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yhyhy3/med-orca-instruct-33b](https://huggingface.co/yhyhy3/med-orca-instruct-33b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-17T22:27:51.480164](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b/blob/main/results_2023-10-17T22-27-51.480164.json) (note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 6.606543624161075e-05,\n \"f1_stderr\"\
: 2.6666679153418564e-05,\n \"acc\": 0.2525651144435675,\n \"acc_stderr\"\
: 0.007025872980895256\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 6.606543624161075e-05,\n \"\
f1_stderr\": 2.6666679153418564e-05\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790513\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yhyhy3/med-orca-instruct-33b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|arc:challenge|25_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|arc:challenge|25_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_26T02_39_23.109820
path:
- '**/details_harness|drop|3_2023-09-26T02-39-23.109820.parquet'
- split: 2023_10_17T22_27_51.480164
path:
- '**/details_harness|drop|3_2023-10-17T22-27-51.480164.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T22-27-51.480164.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_26T02_39_23.109820
path:
- '**/details_harness|gsm8k|5_2023-09-26T02-39-23.109820.parquet'
- split: 2023_10_17T22_27_51.480164
path:
- '**/details_harness|gsm8k|5_2023-10-17T22-27-51.480164.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T22-27-51.480164.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hellaswag|10_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hellaswag|10_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:49:32.359108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T13:49:32.359108.parquet'
- split: 2023_08_18T09_03_49.045450
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T09:03:49.045450.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T09:03:49.045450.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_26T02_39_23.109820
path:
- '**/details_harness|winogrande|5_2023-09-26T02-39-23.109820.parquet'
- split: 2023_10_17T22_27_51.480164
path:
- '**/details_harness|winogrande|5_2023-10-17T22-27-51.480164.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T22-27-51.480164.parquet'
- config_name: results
data_files:
- split: 2023_08_09T13_49_32.359108
path:
- results_2023-08-09T13:49:32.359108.parquet
- split: 2023_08_18T09_03_49.045450
path:
- results_2023-08-18T09:03:49.045450.parquet
- split: 2023_09_26T02_39_23.109820
path:
- results_2023-09-26T02-39-23.109820.parquet
- split: 2023_10_17T22_27_51.480164
path:
- results_2023-10-17T22-27-51.480164.parquet
- split: latest
path:
- results_2023-10-17T22-27-51.480164.parquet
---
# Dataset Card for Evaluation run of yhyhy3/med-orca-instruct-33b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yhyhy3/med-orca-instruct-33b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yhyhy3/med-orca-instruct-33b](https://huggingface.co/yhyhy3/med-orca-instruct-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T22:27:51.480164](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__med-orca-instruct-33b/blob/main/results_2023-10-17T22-27-51.480164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 6.606543624161075e-05,
"f1_stderr": 2.6666679153418564e-05,
"acc": 0.2525651144435675,
"acc_stderr": 0.007025872980895256
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 6.606543624161075e-05,
"f1_stderr": 2.6666679153418564e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.014051745961790513
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_1.3b_VQAv2_visclues_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_8
num_bytes: 25491982
num_examples: 1000
download_size: 4915915
dataset_size: 25491982
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_1.3b_VQAv2_visclues_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Umal-exvc/chocolate-captioned-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 78434533.0
num_examples: 500
download_size: 76921151
dataset_size: 78434533.0
---
# Dataset Card for "chocolate-captioned-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ru3ll/zindi_test | ---
dataset_info:
features:
- name: audio_paths
dtype:
audio:
sampling_rate: 16000
- name: ID
dtype: string
splits:
- name: train
num_bytes: 10452392634.544
num_examples: 6318
download_size: 9044328275
dataset_size: 10452392634.544
---
# Dataset Card for "zindi_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidyoungoc/TestAI | ---
license: mit
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
llm-aes/gemini_hana_full_analyze_rate | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: worker_id
dtype: string
- name: human_label
dtype: int64
- name: llm_label
dtype: int64
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: premise
dtype: string
splits:
- name: train
num_bytes: 1133925
num_examples: 5280
download_size: 109547
dataset_size: 1133925
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigbio/umnsrs |
---
language:
- en
bigbio_language:
- English
license: cc0-1.0
multilinguality: monolingual
bigbio_license_shortname: CC0_1p0
pretty_name: UMNSRS
homepage: https://conservancy.umn.edu/handle/11299/196265/
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- SEMANTIC_SIMILARITY
---
# Dataset Card for UMNSRS
## Dataset Description
- **Homepage:** https://conservancy.umn.edu/handle/11299/196265/
- **Pubmed:** False
- **Public:** True
- **Tasks:** STS
UMNSRS, developed by Pakhomov et al., consists of 725 clinical term pairs annotated for semantic similarity and relatedness.
The similarity and relatedness of each term pair was rated on a continuous scale by having the resident touch
a bar on a touch-sensitive computer screen to indicate the degree of similarity or relatedness.
The following subsets are available:
- similarity: A set of 566 UMLS concept pairs manually rated for semantic similarity (e.g. whale-dolphin) using a
continuous response scale.
- relatedness: A set of 588 UMLS concept pairs manually rated for semantic relatedness (e.g. needle-thread) using a
continuous response scale.
- similarity_mod: Modification of the UMNSRS-Similarity dataset to exclude control samples and those pairs that did not
match text in clinical, biomedical and general English corpora. Exact modifications are detailed in the paper (Corpus
Domain Effects on Distributional Semantic Modeling of Medical Terms. Serguei V.S. Pakhomov, Greg Finley, Reed McEwan,
Yan Wang, and Genevieve B. Melton. Bioinformatics. 2016; 32(23):3635-3644). The resulting dataset contains 449 pairs.
- relatedness_mod: Modification of the UMNSRS-Relatedness dataset to exclude control samples and those pairs that did
not match text in clinical, biomedical and general English corpora. Exact modifications are detailed in the paper
(Corpus Domain Effects on Distributional Semantic Modeling of Medical Terms. Serguei V.S. Pakhomov, Greg Finley,
Reed McEwan, Yan Wang, and Genevieve B. Melton. Bioinformatics. 2016; 32(23):3635-3644).
The resulting dataset contains 458 pairs.
## Citation Information
```
@inproceedings{pakhomov2010semantic,
title={Semantic similarity and relatedness between clinical terms: an experimental study},
author={Pakhomov, Serguei and McInnes, Bridget and Adam, Terrence and Liu, Ying and Pedersen, Ted and Melton, Genevieve B},
booktitle={AMIA annual symposium proceedings},
volume={2010},
pages={572},
year={2010},
organization={American Medical Informatics Association}
}
```
|
bjkim1/chatbot | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 52332872
num_examples: 333917
download_size: 13438152
dataset_size: 52332872
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
el2e10/aya-paraphrase-gujarati | ---
language:
- gu
license: cc
size_categories:
- n<1K
source_datasets:
- extended|ai4bharat/IndicXParaphrase
task_categories:
- text-generation
pretty_name: Aya Paraphrase Gujarati
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: template_lang
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 623061
num_examples: 1001
download_size: 226705
dataset_size: 623061
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
### Description
This dataset is derived from an existing dataset created by AI4Bharat. We used the [IndicXParaphrase](https://huggingface.co/datasets/ai4bharat/IndicXParaphrase) dataset from AI4Bharat to create this instruction-style dataset.
We used the Gujarati split of the above-mentioned dataset to create this one. It was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
IndicXParaphrase is a multilingual, n-way parallel dataset for paraphrase detection in 10 Indic languages. The original dataset (IndicXParaphrase) was made available under the CC0 license.
### Template
The following templates (in Gujarati) were used to convert the original dataset:
```
#Template 1
prompt:
નીચેના વાક્યને અલગ શબ્દોનો ઉપયોગ કરીને લખો: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 2
prompt:
નીચેના વાક્યને અલગ રીતે ફરીથી લખો: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 3
prompt:
નીચેના વાક્યને બીજા સ્વરૂપમાં ફરીથી લખો: "{original_sentence}"
completion:
{paraphrased_sentence}
```
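As a minimal, hypothetical sketch (the sentence pair below is a placeholder, not a row from the dataset), applying Template 1 to produce an instruction-style record might look like this:

```python
# Hypothetical illustration: mapping an (original, paraphrase) sentence pair
# to an instruction-style record using Template 1 above.
TEMPLATE_1 = 'નીચેના વાક્યને અલગ શબ્દોનો ઉપયોગ કરીને લખો: "{original_sentence}"'

def to_record(original_sentence: str, paraphrased_sentence: str) -> dict:
    """Build a prompt/completion pair in the style of the dataset's
    `inputs`/`targets` columns (column names taken from the YAML above)."""
    return {
        "inputs": TEMPLATE_1.format(original_sentence=original_sentence),
        "targets": paraphrased_sentence,
    }

record = to_record("મૂળ વાક્ય", "પુનર્લેખિત વાક્ય")
print(record["inputs"])   # the prompt with the original sentence filled in
print(record["targets"])  # the paraphrase as the completion
```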
### Acknowledgement
Thank you, Jay Patel, for helping prepare this dataset by providing the Gujarati translations of the above-mentioned English prompts. |
gsstein/75-percent-human-dataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated
dtype: bool
splits:
- name: train
num_bytes: 86459784
num_examples: 15326
- name: test
num_bytes: 3069277
num_examples: 576
- name: validation
num_bytes: 3265591
num_examples: 576
download_size: 57484245
dataset_size: 92794652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
mrachilles/NTU60PointsSliced | ---
license: mit
---
|
cfahlgren1/wiki_sql_pg_converted | ---
license: mit
---
|
BossBossNJb/cifar10_dataset_th_en | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: img
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': automobile
'2': bird
'3': cat
'4': deer
'5': dog
'6': frog
'7': horse
'8': ship
'9': truck
- name: en
dtype: string
- name: th
dtype: string
splits:
- name: train
num_bytes: 115003310.0
num_examples: 50000
- name: test
num_bytes: 23002580.0
num_examples: 10000
download_size: 144125889
dataset_size: 138005890.0
---
# Dataset Card for "cifar10_dataset_th_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ailover/ExTES | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 76888785
num_examples: 81057
- name: test
num_bytes: 8540402
num_examples: 9006
download_size: 20921859
dataset_size: 85429187
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
HANTIFARAH/Hindawi-Books-dataset | ---
dataset_info:
features:
- name: BookLink
dtype: string
- name: BookName
dtype: string
- name: AuthorName
dtype: string
- name: AboutBook
dtype: string
- name: ChapterLink
dtype: string
- name: ChapterName
dtype: string
- name: ChapterText
dtype: string
- name: AboutAuthor
dtype: string
splits:
- name: train
num_bytes: 1368781503
num_examples: 47460
download_size: 506773060
dataset_size: 1368781503
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Pasulo/IBIS-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 44436
num_examples: 61
download_size: 15399
dataset_size: 44436
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_ericpolewski__AIRIC-The-Intern | ---
pretty_name: Evaluation run of ericpolewski/AIRIC-The-Intern
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ericpolewski/AIRIC-The-Intern](https://huggingface.co/ericpolewski/AIRIC-The-Intern)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ericpolewski__AIRIC-The-Intern\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T14:54:24.432126](https://huggingface.co/datasets/open-llm-leaderboard/details_ericpolewski__AIRIC-The-Intern/blob/main/results_2024-03-21T14-54-24.432126.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5142733904823833,\n\
\ \"acc_stderr\": 0.03382655038661089,\n \"acc_norm\": 0.5235571721502557,\n\
\ \"acc_norm_stderr\": 0.03468685992062916,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.526711958949235,\n\
\ \"mc2_stderr\": 0.015161165771156261\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48464163822525597,\n \"acc_stderr\": 0.014604496129394913,\n\
\ \"acc_norm\": 0.5273037542662116,\n \"acc_norm_stderr\": 0.014589589101985994\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5702051384186417,\n\
\ \"acc_stderr\": 0.004940349676769321,\n \"acc_norm\": 0.7706632144991038,\n\
\ \"acc_norm_stderr\": 0.004195470693130787\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.03077090076385131,\n\
\ \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.03077090076385131\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.02770935967503249,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.02770935967503249\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819115,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6752293577981652,\n \"acc_stderr\": 0.020077729109310327,\n \"\
acc_norm\": 0.6752293577981652,\n \"acc_norm_stderr\": 0.020077729109310327\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.0332057461294543,\n \"acc_norm\"\
: 0.6617647058823529,\n \"acc_norm_stderr\": 0.0332057461294543\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808517,\n \"\
acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808517\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831028,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831028\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.02665569965392275,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.02665569965392275\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.01685739124747255,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.01685739124747255\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225619,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225619\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647011994,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647011994\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839796,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159615,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159615\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4002607561929596,\n\
\ \"acc_stderr\": 0.012513582529136215,\n \"acc_norm\": 0.4002607561929596,\n\
\ \"acc_norm_stderr\": 0.012513582529136215\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.029674288281311183,\n\
\ \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.029674288281311183\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.526711958949235,\n\
\ \"mc2_stderr\": 0.015161165771156261\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7087608524072613,\n \"acc_stderr\": 0.012769029305370695\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \
\ \"acc_stderr\": 0.0033660229497263507\n }\n}\n```"
repo_url: https://huggingface.co/ericpolewski/AIRIC-The-Intern
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|arc:challenge|25_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|gsm8k|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hellaswag|10_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T12-30-57.894706.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-54-24.432126.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-54-24.432126.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- '**/details_harness|winogrande|5_2024-03-21T12-30-57.894706.parquet'
- split: 2024_03_21T14_54_24.432126
path:
- '**/details_harness|winogrande|5_2024-03-21T14-54-24.432126.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T14-54-24.432126.parquet'
- config_name: results
data_files:
- split: 2024_03_21T12_30_57.894706
path:
- results_2024-03-21T12-30-57.894706.parquet
- split: 2024_03_21T14_54_24.432126
path:
- results_2024-03-21T14-54-24.432126.parquet
- split: latest
path:
- results_2024-03-21T14-54-24.432126.parquet
---
# Dataset Card for Evaluation run of ericpolewski/AIRIC-The-Intern
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ericpolewski/AIRIC-The-Intern](https://huggingface.co/ericpolewski/AIRIC-The-Intern) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ericpolewski__AIRIC-The-Intern",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-21T14:54:24.432126](https://huggingface.co/datasets/open-llm-leaderboard/details_ericpolewski__AIRIC-The-Intern/blob/main/results_2024-03-21T14-54-24.432126.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5142733904823833,
"acc_stderr": 0.03382655038661089,
"acc_norm": 0.5235571721502557,
"acc_norm_stderr": 0.03468685992062916,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.526711958949235,
"mc2_stderr": 0.015161165771156261
},
"harness|arc:challenge|25": {
"acc": 0.48464163822525597,
"acc_stderr": 0.014604496129394913,
"acc_norm": 0.5273037542662116,
"acc_norm_stderr": 0.014589589101985994
},
"harness|hellaswag|10": {
"acc": 0.5702051384186417,
"acc_stderr": 0.004940349676769321,
"acc_norm": 0.7706632144991038,
"acc_norm_stderr": 0.004195470693130787
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.03077090076385131,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.03077090076385131
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.02770935967503249,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.02770935967503249
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819115,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6752293577981652,
"acc_stderr": 0.020077729109310327,
"acc_norm": 0.6752293577981652,
"acc_norm_stderr": 0.020077729109310327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.0332057461294543,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.0332057461294543
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808517,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831028,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831028
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.02665569965392275,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.02665569965392275
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.01685739124747255,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.01685739124747255
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225619,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225619
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011994,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011994
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839796,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159615,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159615
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4002607561929596,
"acc_stderr": 0.012513582529136215,
"acc_norm": 0.4002607561929596,
"acc_norm_stderr": 0.012513582529136215
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.029674288281311183,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.029674288281311183
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.0202239460050743,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.0202239460050743
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.031557828165561644,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.031557828165561644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.526711958949235,
"mc2_stderr": 0.015161165771156261
},
"harness|winogrande|5": {
"acc": 0.7087608524072613,
"acc_stderr": 0.012769029305370695
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263507
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
approach0/MATH_and_PRM | ---
dataset_info:
features:
- name: src_path
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 15325348.0
num_examples: 13665
- name: test
num_bytes: 8685910.0
num_examples: 8076
download_size: 9782004
dataset_size: 24011258.0
---
# Dataset Card for "MATH_and_PRM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anirudhajith/test | ---
license: unknown
---
|
irds/argsme_2020-04-01_touche-2021-task-1 | ---
pretty_name: '`argsme/2020-04-01/touche-2021-task-1`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `argsme/2020-04-01/touche-2021-task-1`
The `argsme/2020-04-01/touche-2021-task-1` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/argsme#argsme/2020-04-01/touche-2021-task-1).
# Data
This dataset provides:
- `queries` (i.e., topics); count=50
- `qrels`: (relevance assessments); count=3,711
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/argsme_2020-04-01_touche-2021-task-1', 'queries')
for record in queries:
record # {'query_id': ..., 'title': ...}
qrels = load_dataset('irds/argsme_2020-04-01_touche-2021-task-1', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'quality': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
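As a rough sketch of how the qrels records might be aggregated once loaded — the record dicts below are hypothetical stand-ins that only mirror the schema shown above, not actual dataset contents:

```python
from collections import defaultdict

# Hypothetical qrels records mirroring the schema above (not real judgments).
qrels = [
    {"query_id": "1", "doc_id": "d1", "relevance": 2, "quality": 2, "iteration": "0"},
    {"query_id": "1", "doc_id": "d2", "relevance": 0, "quality": 1, "iteration": "0"},
    {"query_id": "2", "doc_id": "d3", "relevance": 1, "quality": 2, "iteration": "0"},
]

def relevant_counts(qrels, min_relevance=1):
    """Count documents judged at least `min_relevance` per topic."""
    counts = defaultdict(int)
    for record in qrels:
        if record["relevance"] >= min_relevance:
            counts[record["query_id"]] += 1
    return dict(counts)

print(relevant_counts(qrels))  # {'1': 1, '2': 1}
```

The same loop shape applies to the real records once `load_dataset` has materialized them.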
## Citation Information
```
@inproceedings{Bondarenko2021Touche,
address = {Berlin Heidelberg New York},
author = {Alexander Bondarenko and Lukas Gienapp and Maik Fr{\"o}be and Meriem Beloucif and Yamen Ajjour and Alexander Panchenko and Chris Biemann and Benno Stein and Henning Wachsmuth and Martin Potthast and Matthias Hagen},
booktitle = {Experimental IR Meets Multilinguality, Multimodality, and Interaction. 12th International Conference of the CLEF Association (CLEF 2021)},
doi = {10.1007/978-3-030-85251-1\_28},
editor = {{K. Sel{\c{c}}uk} Candan and Bogdan Ionescu and Lorraine Goeuriot and Henning M{\"u}ller and Alexis Joly and Maria Maistro and Florina Piroi and Guglielmo Faggioli and Nicola Ferro},
month = sep,
pages = {450-467},
publisher = {Springer},
series = {Lecture Notes in Computer Science},
site = {Bucharest, Romania},
title = {{Overview of Touch{\'e} 2021: Argument Retrieval}},
url = {https://link.springer.com/chapter/10.1007/978-3-030-85251-1_28},
volume = 12880,
year = 2021,
}
```
|
COLAB0102/vozzcarol | ---
license: openrail
---
|
Rarioty/kaggle | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train*
--- |
vwxyzjn/ultrachat_200k_filtered_1707921252 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: test_gen
num_bytes: 30484069
num_examples: 1000
- name: test_sft
num_bytes: 39592502
num_examples: 1000
- name: train_gen
num_bytes: 29613744
num_examples: 1000
- name: train_sft
num_bytes: 39521233
num_examples: 1000
download_size: 50859072
dataset_size: 139211548
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
|
Zangs3011/no_robots_llama2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 29092450
num_examples: 9500
- name: test
num_bytes: 1560738
num_examples: 500
download_size: 18917122
dataset_size: 30653188
---
# Dataset Card for "no_robots_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Crystalcareai/promptengineer | ---
license: apache-2.0
---
|
DBQ/Ounass.Product.prices.Oman | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Oman - Ounass - Product-level price list
tags:
- webscraping
- ecommerce
- Ounass
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 29313663
num_examples: 71958
download_size: 9088483
dataset_size: 29313663
---
# Ounass web scraped data
## About the website
Ounass operates within the **e-commerce industry in EMEA**, specifically within the **Luxury Fashion sector**. It is an integral part of the online retail market in the region, offering premium clothing, accessories, and home and beauty products. With a special focus on **Oman**, Ounass has made a significant footprint in the country's e-commerce landscape, serving high-end customers with an expansive array of products from global luxury brands. The dataset contains **Ecommerce product-listing page (PLP) data** focused on Ounass's operations in Oman, providing insights into the company's product range, pricing, and performance in the region.
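As an illustration of how the price fields listed in the metadata (`price_eur`, `full_price_eur`, `flg_discount`) might be analyzed — the rows below are hypothetical stand-ins mirroring the dataset's schema, not actual listings:

```python
# Hypothetical sample rows mirroring the dataset schema (not real listings).
rows = [
    {"brand": "BrandA", "price_eur": 450.0, "full_price_eur": 900.0, "flg_discount": 1},
    {"brand": "BrandB", "price_eur": 700.0, "full_price_eur": 700.0, "flg_discount": 0},
    {"brand": "BrandA", "price_eur": 300.0, "full_price_eur": 400.0, "flg_discount": 1},
]

def discount_share(rows):
    """Fraction of listed products currently flagged as discounted."""
    if not rows:
        return 0.0
    return sum(r["flg_discount"] for r in rows) / len(rows)

def average_markdown(rows):
    """Mean relative markdown (in EUR prices) across discounted products only."""
    discounted = [r for r in rows if r["flg_discount"] == 1]
    if not discounted:
        return 0.0
    return sum(1 - r["price_eur"] / r["full_price_eur"] for r in discounted) / len(discounted)

print(round(discount_share(rows), 2))    # 0.67
print(round(average_markdown(rows), 2))  # 0.38
```

The same computations apply row-for-row to the real data once loaded, since they rely only on the fields declared in the card's `dataset_info`.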
## Link to **dataset**
[Oman - Ounass - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Ounass%20Product-prices%20Oman/r/recgVd1MMemHovvyX)
|
felipesampaio/darwin2 | ---
license: openrail
---
|
azcorpus/azcorpus_v0 | ---
extra_gated_prompt: "You agree to not use the dataset to conduct experiments that cause harm to human subjects."
extra_gated_fields:
Name and Surname: text
Email: text
Company: text
Purpose of Use: text
I agree to use this dataset for non-commercial use ONLY: checkbox
license: openrail
---

# azcorpus - The largest open-source NLP corpus for Azerbaijani (1.9M documents, ~ 18M sentences)
__Due to ongoing maintenance activities, only a portion of our corpus is currently available for access.__
In recent years, deep learning models have been widely used in NLP, yielding excellent results. However, most NLP research has focused on high-resource languages such as English, leaving a significant gap for low-resource languages, Azerbaijani included. As a result, adequate corpora are still scarce for most languages, especially less-resourced ones such as Azerbaijani.
Therefore, this study aims to contribute to the NLP research community by building the largest NLP corpus for the Azerbaijani language.
## Corpus Summary
“azcorpus”, built for text generation purposes, contains a total of 1.9 million documents drawn from a variety of sources. The corpus is designed to provide a broad range of linguistic data for natural language processing and is organized by genre and topic, with texts covering subjects including politics, economics, science, culture, sport, history, and society.
Texts were selected from sources including newspapers, magazines, academic journals, Wikipedia articles, and books. The corpus includes both contemporary and historical texts, providing a rich linguistic and cultural context for natural language processing applications.
___
## Corpus structure
### Data fields
- id: Document id
- text - Newline-separated content
- source - Document source
- reliability - Subjective cleaning evaluation rate
- license - Document license
### Data Splits
This corpus has 3 sources (az_books, az_wiki, and az_news) and 1,876,492 cleaned documents.
| Source name | Number of Instances | Size (GB) |
| ------------- | --------------------|:----------------------|
| az_books | 1,540,732 | 19.5 |
| az_wiki | 98,882 | 0.9 |
| az_news | 236,878 | 3.8 |
___
## Methodology
The first step in building "azcorpus" was to collect text data from various sources.
The news websites were selected based on their popularity and the diversity of topics covered.
Additionally, a collection of ebooks in Azerbaijani was obtained from various online sources. We expanded this collection to encompass not only fiction but also scholarly works in fields such as physics and chemistry.
Source-specific cleaning techniques were applied separately to ensure consistency and accuracy in the corpus. Further details on the methodology will be presented in our forthcoming academic paper.
To ensure the ethical use of the corpus, we only collected publicly available data, and we did not collect any personal or sensitive information. We also ensured that the corpus was used for research purposes only and not for commercial gain. For legal reasons, we do not currently plan to disclose the individual sources.
___
## Corpus Usage
To obtain comprehensive guidance on how to use "azcorpus", please refer to the detailed usage instructions provided in this [notebook](https://github.com/azcorpus/azcorpus_v0/blob/main/azcorpus_v0.ipynb).
```python
corpus = AzCorpus(access_token="your_token")

# Obtain the corpus in raw JSON format
corpus.generate_samples()
```
Downloading the entire corpus takes approximately 25 minutes to 2 hours, depending on your internet connection speed. We are currently refining the download script to improve efficiency.
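Once the raw JSON documents are downloaded, they can be filtered locally using the `reliability` field described in the data-fields section. A minimal sketch — the documents below are hypothetical stand-ins mirroring the corpus fields (`id`, `text`, `source`, `reliability`, `license`), not real corpus content:

```python
# Hypothetical documents mirroring the corpus fields (not real corpus content).
docs = [
    {"id": "az_news_001", "text": "...", "source": "az_news", "reliability": 0.95, "license": "CC"},
    {"id": "az_books_042", "text": "...", "source": "az_books", "reliability": 0.60, "license": "CC"},
    {"id": "az_wiki_007", "text": "...", "source": "az_wiki", "reliability": 0.88, "license": "CC"},
]

def filter_reliable(docs, threshold=0.8):
    """Keep only documents whose subjective cleaning score meets the threshold."""
    return [d for d in docs if d["reliability"] >= threshold]

def count_by_source(docs):
    """Tally documents per source (az_books / az_wiki / az_news)."""
    counts = {}
    for d in docs:
        counts[d["source"]] = counts.get(d["source"], 0) + 1
    return counts

print([d["id"] for d in filter_reliable(docs)])  # ['az_news_001', 'az_wiki_007']
print(count_by_source(docs))
```

The threshold value here is illustrative; choose one appropriate to your downstream task.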
___
## Considerations for Using the Corpus
#### Social Impact
Our work has the potential to contribute to the community by providing a valuable resource for the development of new text generation tools in Azerbaijani.
"azcorpus" demonstrates the importance of building large NLP corpora for under-resourced languages, and highlights the social impact of such resources. By making this corpus available to the wider community, we hope to stimulate further research and development in the field of Azerbaijani text generation, and contribute to the broader goal of promoting linguistic diversity and cultural heritage. Future studies could explore the potential community impact of our work.
#### Biases and Limitations
Addressing potential bias in machine learning corpora is a common concern in research.
In this study, we acknowledge that our dataset may be subject to bias, and we employed several techniques to mitigate this issue.
However, we recognize that our approach may still have limitations.
It is therefore important to exercise caution with models trained on a version of "azcorpus" that has not been adequately filtered, as any remaining biases can carry over into the resulting models. In particular, be mindful of biases that may be present in "azcorpus_v0".
Future work could further investigate these issues and explore additional methods to address bias in the corpus.
___
## Additional Information
#### Corpus authors
The corpus was put together by [Huseyn Kishiyev](https://www.linkedin.com/in/huseynkishiyev/), [Jafar Isbarov](https://www.linkedin.com/in/jafar-isbarov/), [Kanan Suleymanli](https://www.linkedin.com/in/kanan-suleyman/), [Khazar Heydarli](https://www.linkedin.com/in/xezer-heyderli/), [Leyla Eminova](https://www.linkedin.com/in/leyla-eminova/) and [Nijat Zeynalov](https://www.linkedin.com/in/nijat-zeynalov-064163142/).
The authors' names have been arranged in alphabetical order. All authors have equal rights and contributed equally to this work.
The authors declare no conflict of interest. There are no funding sponsors, and no one other than the authors had any role in the design of the work; in the collection, analysis, or interpretation of data; in the writing of the manuscript; or in the decision to publish the corpus.
___ |
rai-sandeep/dataset_format_1 | ---
dataset_info:
features:
- name: task
dtype: string
- name: format
dtype: string
splits:
- name: train
num_bytes: 1746
num_examples: 2
download_size: 7518
dataset_size: 1746
---
# Dataset Card for "dataset_format_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DrunkEdition/JorgeJeM | ---
license: openrail
---
|
aixsatoshi/Longcontext-aozora-summary | ---
license: cc
language:
- ja
---
This is a dataset of summaries generated from long texts.
The long texts are taken from the following Aozora Bunko dataset:
[globis-university/aozorabunko-clean](https://huggingface.co/datasets/globis-university/aozorabunko-clean)
# License
CC BY 4.0 |
Daniele/dante-corpus | ---
---
# Dataset Card for dante-corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Additional Information](#additional-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The complete literary production of the great poet Dante Alighieri.
### Supported Tasks and Leaderboards
Fill Mask task.
### Languages
(Ancient) Italian.
### Contributions
Thanks to [@danielekp](https://github.com/danielekp) for adding this dataset.
|
maxolotl/must-c-en-de-wait4-01 | ---
dataset_info:
features:
- name: current_source
dtype: string
- name: current_target
dtype: string
- name: target_token
dtype: string
splits:
- name: train
num_bytes: 826970789
num_examples: 4513829
- name: test
num_bytes: 10182976
num_examples: 57041
- name: validation
num_bytes: 5115344
num_examples: 26843
download_size: 160313894
dataset_size: 842269109
---
# Dataset Card for "must-c-en-de-wait4-01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rbojja/medical-vqa | ---
license: mit
---
|
open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1 | ---
pretty_name: Evaluation run of JaeyeonKang/CCK_Asura_v2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JaeyeonKang/CCK_Asura_v2.1](https://huggingface.co/JaeyeonKang/CCK_Asura_v2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T16:16:03.001484](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1/blob/main/results_2024-02-11T16-16-03.001484.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7485227705373686,\n\
\ \"acc_stderr\": 0.028690850662240704,\n \"acc_norm\": 0.7515411094238932,\n\
\ \"acc_norm_stderr\": 0.02924740008798966,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6733433696722777,\n\
\ \"mc2_stderr\": 0.014930500543970958\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6766211604095563,\n \"acc_stderr\": 0.013669421630012127,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7064329814777933,\n\
\ \"acc_stderr\": 0.004544651976040094,\n \"acc_norm\": 0.8874726150169289,\n\
\ \"acc_norm_stderr\": 0.0031536835304090366\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\
\ \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n\
\ \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.029162631596843982,\n\
\ \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.029162631596843982\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n\
\ \"acc_stderr\": 0.02554523921025691,\n \"acc_norm\": 0.8958333333333334,\n\
\ \"acc_norm_stderr\": 0.02554523921025691\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n\
\ \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.7630057803468208,\n\
\ \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7276595744680852,\n \"acc_stderr\": 0.029101290698386715,\n\
\ \"acc_norm\": 0.7276595744680852,\n \"acc_norm_stderr\": 0.029101290698386715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7379310344827587,\n \"acc_stderr\": 0.03664666337225257,\n\
\ \"acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.03664666337225257\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5529100529100529,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.5529100529100529,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432302,\n \"\
acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432302\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\"\
: 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7897435897435897,\n \"acc_stderr\": 0.020660597485026935,\n\
\ \"acc_norm\": 0.7897435897435897,\n \"acc_norm_stderr\": 0.020660597485026935\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45925925925925926,\n \"acc_stderr\": 0.03038416923235082,\n \
\ \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.03038416923235082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.02244826447683258,\n \
\ \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.02244826447683258\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016569,\n \"\
acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016569\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7175925925925926,\n \"acc_stderr\": 0.030701372111510934,\n \"\
acc_norm\": 0.7175925925925926,\n \"acc_norm_stderr\": 0.030701372111510934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564028,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564028\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9256198347107438,\n \"acc_stderr\": 0.02395268883667674,\n \"\
acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.02395268883667674\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.01789378490401853,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.01789378490401853\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8914431673052363,\n\
\ \"acc_stderr\": 0.011124283175851181,\n \"acc_norm\": 0.8914431673052363,\n\
\ \"acc_norm_stderr\": 0.011124283175851181\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8294797687861272,\n \"acc_stderr\": 0.020247961569303728,\n\
\ \"acc_norm\": 0.8294797687861272,\n \"acc_norm_stderr\": 0.020247961569303728\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6491620111731844,\n\
\ \"acc_stderr\": 0.015961036675230973,\n \"acc_norm\": 0.6491620111731844,\n\
\ \"acc_norm_stderr\": 0.015961036675230973\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880973,\n\
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880973\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778034,\n\
\ \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778034\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5780141843971631,\n \"acc_stderr\": 0.0294621892333706,\n \
\ \"acc_norm\": 0.5780141843971631,\n \"acc_norm_stderr\": 0.0294621892333706\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5795306388526728,\n\
\ \"acc_stderr\": 0.012607654553832703,\n \"acc_norm\": 0.5795306388526728,\n\
\ \"acc_norm_stderr\": 0.012607654553832703\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \
\ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n\
\ \"acc_stderr\": 0.019675343217199177,\n \"acc_norm\": 0.9154228855721394,\n\
\ \"acc_norm_stderr\": 0.019675343217199177\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759418,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759418\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6733433696722777,\n\
\ \"mc2_stderr\": 0.014930500543970958\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8587213891081295,\n \"acc_stderr\": 0.009789206625044573\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \
\ \"acc_stderr\": 0.012740305717376268\n }\n}\n```"
repo_url: https://huggingface.co/JaeyeonKang/CCK_Asura_v2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|arc:challenge|25_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|gsm8k|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hellaswag|10_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T16-16-03.001484.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- '**/details_harness|winogrande|5_2024-02-11T16-16-03.001484.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T16-16-03.001484.parquet'
- config_name: results
data_files:
- split: 2024_02_11T16_16_03.001484
path:
- results_2024-02-11T16-16-03.001484.parquet
- split: latest
path:
- results_2024-02-11T16-16-03.001484.parquet
---
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v2.1](https://huggingface.co/JaeyeonKang/CCK_Asura_v2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T16:16:03.001484](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1/blob/main/results_2024-02-11T16-16-03.001484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7485227705373686,
"acc_stderr": 0.028690850662240704,
"acc_norm": 0.7515411094238932,
"acc_norm_stderr": 0.02924740008798966,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6733433696722777,
"mc2_stderr": 0.014930500543970958
},
"harness|arc:challenge|25": {
"acc": 0.6766211604095563,
"acc_stderr": 0.013669421630012127,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7064329814777933,
"acc_stderr": 0.004544651976040094,
"acc_norm": 0.8874726150169289,
"acc_norm_stderr": 0.0031536835304090366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.029162631596843982,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.029162631596843982
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.02554523921025691,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.02554523921025691
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7276595744680852,
"acc_stderr": 0.029101290698386715,
"acc_norm": 0.7276595744680852,
"acc_norm_stderr": 0.029101290698386715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7379310344827587,
"acc_stderr": 0.03664666337225257,
"acc_norm": 0.7379310344827587,
"acc_norm_stderr": 0.03664666337225257
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5529100529100529,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.5529100529100529,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432302,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7897435897435897,
"acc_stderr": 0.020660597485026935,
"acc_norm": 0.7897435897435897,
"acc_norm_stderr": 0.020660597485026935
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.03038416923235082,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.03038416923235082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.02244826447683258,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.02244826447683258
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016569,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016569
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7175925925925926,
"acc_stderr": 0.030701372111510934,
"acc_norm": 0.7175925925925926,
"acc_norm_stderr": 0.030701372111510934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.01926932302564028,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.01926932302564028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.02624113299640726,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.02624113299640726
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9256198347107438,
"acc_stderr": 0.02395268883667674,
"acc_norm": 0.9256198347107438,
"acc_norm_stderr": 0.02395268883667674
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401853,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401853
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8914431673052363,
"acc_stderr": 0.011124283175851181,
"acc_norm": 0.8914431673052363,
"acc_norm_stderr": 0.011124283175851181
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8294797687861272,
"acc_stderr": 0.020247961569303728,
"acc_norm": 0.8294797687861272,
"acc_norm_stderr": 0.020247961569303728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6491620111731844,
"acc_stderr": 0.015961036675230973,
"acc_norm": 0.6491620111731844,
"acc_norm_stderr": 0.015961036675230973
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880973,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880973
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.0216700588855108,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.0216700588855108
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.020423955354778034,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.020423955354778034
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5780141843971631,
"acc_stderr": 0.0294621892333706,
"acc_norm": 0.5780141843971631,
"acc_norm_stderr": 0.0294621892333706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5795306388526728,
"acc_stderr": 0.012607654553832703,
"acc_norm": 0.5795306388526728,
"acc_norm_stderr": 0.012607654553832703
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199177,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199177
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.02386832565759418,
"acc_norm": 0.94,
"acc_norm_stderr": 0.02386832565759418
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6733433696722777,
"mc2_stderr": 0.014930500543970958
},
"harness|winogrande|5": {
"acc": 0.8587213891081295,
"acc_stderr": 0.009789206625044573
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.012740305717376268
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MetamatSoul/Souls | ---
license: unknown
---
|
dhuynh95/Magicoder-Evol-Instruct-5000-Deepseek-tokenized-0.5 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12046086
num_examples: 5000
download_size: 5483108
dataset_size: 12046086
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Unix95/mach44 | ---
license: openrail
---
|
BadreddineHug/badreddine_llama_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype:
class_label:
names:
'0': general_qa
'1': closed_qa
'2': open_qa
'3': classification
'4': information_extraction
splits:
- name: train
num_bytes: 99456
num_examples: 64
download_size: 12865
dataset_size: 99456
---
# Dataset Card for "badreddine_llama_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
makram93/rejected_pairs_base | ---
dataset_info:
features:
- name: url
dtype: string
- name: doc_id
dtype: string
- name: original_title
sequence: string
- name: right
dtype: string
- name: left
dtype: string
splits:
- name: train
num_bytes: 88447.0623234648
num_examples: 100
download_size: 0
dataset_size: 88447.0623234648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rejected_pairs_base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anzorq/kbd_monolingual | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: source
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 157956610
num_examples: 18141
download_size: 71398445
dataset_size: 157956610
---
# Dataset Card for "kbd_monolingual"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sgoedecke/powerful-owl-birdcalls | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float32
- name: sampling_rate
dtype: int64
- name: label
dtype: string
splits:
- name: train
num_bytes: 104468218
num_examples: 270
download_size: 67458981
dataset_size: 104468218
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gradio/custom-component-gallery-backups | ---
license: mit
---
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M | ---
pretty_name: Evaluation run of nicholasKluge/Aira-Instruct-355M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-Instruct-355M](https://huggingface.co/nicholasKluge/Aira-Instruct-355M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-10T09:16:32.685819](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M/blob/main/results_2023-08-10T09%3A16%3A32.685819.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26193708195623533,\n\
\ \"acc_stderr\": 0.03182336083684077,\n \"acc_norm\": 0.263783264473725,\n\
\ \"acc_norm_stderr\": 0.03183912280555913,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520674,\n \"mc2\": 0.4107912986493598,\n\
\ \"mc2_stderr\": 0.014545912502288488\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23890784982935154,\n \"acc_stderr\": 0.012461071376316621,\n\
\ \"acc_norm\": 0.28668941979522183,\n \"acc_norm_stderr\": 0.013214986329274765\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3311093407687712,\n\
\ \"acc_stderr\": 0.004696505101217406,\n \"acc_norm\": 0.39225253933479387,\n\
\ \"acc_norm_stderr\": 0.004872546302641832\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.034260594244031654,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.034260594244031654\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670716,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670716\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387536,\n\
\ \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387536\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.02447224384089553,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.02447224384089553\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233483,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233483\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124505,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124505\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916647,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916647\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232287,\n\
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232287\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3376146788990826,\n\
\ \"acc_stderr\": 0.020275265986638903,\n \"acc_norm\": 0.3376146788990826,\n\
\ \"acc_norm_stderr\": 0.020275265986638903\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863438,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863438\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501936,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501936\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3001277139208174,\n\
\ \"acc_stderr\": 0.01638924969131741,\n \"acc_norm\": 0.3001277139208174,\n\
\ \"acc_norm_stderr\": 0.01638924969131741\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.01092649610203495,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.01092649610203495\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406794,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406794\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.040139645540727735,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.040139645540727735\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.034106466140718564,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.034106466140718564\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520674,\n \"mc2\": 0.4107912986493598,\n\
\ \"mc2_stderr\": 0.014545912502288488\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-Instruct-355M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|arc:challenge|25_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hellaswag|10_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T09:16:32.685819.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T09:16:32.685819.parquet'
- config_name: results
data_files:
- split: 2023_08_10T09_16_32.685819
path:
- results_2023-08-10T09:16:32.685819.parquet
- split: latest
path:
- results_2023-08-10T09:16:32.685819.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-355M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-355M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-355M](https://huggingface.co/nicholasKluge/Aira-Instruct-355M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-10T09:16:32.685819](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M/blob/main/results_2023-08-10T09%3A16%3A32.685819.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.26193708195623533,
"acc_stderr": 0.03182336083684077,
"acc_norm": 0.263783264473725,
"acc_norm_stderr": 0.03183912280555913,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520674,
"mc2": 0.4107912986493598,
"mc2_stderr": 0.014545912502288488
},
"harness|arc:challenge|25": {
"acc": 0.23890784982935154,
"acc_stderr": 0.012461071376316621,
"acc_norm": 0.28668941979522183,
"acc_norm_stderr": 0.013214986329274765
},
"harness|hellaswag|10": {
"acc": 0.3311093407687712,
"acc_stderr": 0.004696505101217406,
"acc_norm": 0.39225253933479387,
"acc_norm_stderr": 0.004872546302641832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.034260594244031654,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.034260594244031654
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670716,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670716
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233483,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233483
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124505,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916647,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916647
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.023119362758232287,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.023119362758232287
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638903,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.028353212866863438,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.028353212866863438
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501936,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501936
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3001277139208174,
"acc_stderr": 0.01638924969131741,
"acc_norm": 0.3001277139208174,
"acc_norm_stderr": 0.01638924969131741
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203495,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203495
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406794,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406794
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727735,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727735
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.034106466140718564,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.034106466140718564
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520674,
"mc2": 0.4107912986493598,
"mc2_stderr": 0.014545912502288488
}
}
```
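As a quick illustration of how these per-task entries can be aggregated, the sketch below recomputes a mean accuracy over the `hendrycksTest` (MMLU) tasks from a dict shaped like the JSON above. Only three entries are inlined as a small subset copied from the results; running the same filter over the full dict gives the unweighted task average in the same spirit as the leaderboard's aggregate.

```python
# Recompute a mean accuracy over the MMLU ("hendrycksTest") tasks from a
# results dict shaped like the JSON above. Only three entries are inlined
# here as an illustrative subset of the full results.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.3333333333333333},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.23026315789473684},
}

# Keep only the MMLU tasks (key prefix "harness|hendrycksTest") and average "acc".
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc = {mean_acc:.4f}")
```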
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_chemistry-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6203
num_examples: 5
- name: test
num_bytes: 253144
num_examples: 100
download_size: 14388
dataset_size: 259347
---
# Dataset Card for "mmlu-college_chemistry-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
presencesw/ecqa | ---
dataset_info:
features:
- name: question
dtype: string
- name: choice
dtype: string
- name: answer
dtype: string
- name: cot
dtype: string
splits:
- name: train
num_bytes: 2003413
num_examples: 7112
download_size: 1242168
dataset_size: 2003413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
giulioappetito/churn_dataset_giulioappetito | ---
license: gpl
---
|
nc33/qapair | ---
dataset_info:
config_name: qna
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: full_answer
dtype: string
- name: FaQ
dtype: string
splits:
- name: train
num_bytes: 2500462492
num_examples: 449423
download_size: 450150169
dataset_size: 2500462492
configs:
- config_name: qna
data_files:
- split: train
path: qna/train-*
---
# Dataset Card for "qapair"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es | ---
pretty_name: Evaluation run of clibrain/Llama-2-ft-instruct-es
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [clibrain/Llama-2-ft-instruct-es](https://huggingface.co/clibrain/Llama-2-ft-instruct-es)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T16:47:05.366390](https://huggingface.co/datasets/open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es/blob/main/results_2023-12-02T16-47-05.366390.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/clibrain/Llama-2-ft-instruct-es
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T17_59_02.863865
path:
- '**/details_harness|drop|3_2023-09-17T17-59-02.863865.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T17-59-02.863865.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T17_59_02.863865
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-59-02.863865.parquet'
- split: 2023_12_02T16_47_05.366390
path:
- '**/details_harness|gsm8k|5_2023-12-02T16-47-05.366390.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T16-47-05.366390.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T17_59_02.863865
path:
- '**/details_harness|winogrande|5_2023-09-17T17-59-02.863865.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T17-59-02.863865.parquet'
- config_name: results
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- results_2023-08-25T19:36:08.180753.parquet
- split: 2023_09_17T17_59_02.863865
path:
- results_2023-09-17T17-59-02.863865.parquet
- split: 2023_12_02T16_47_05.366390
path:
- results_2023-12-02T16-47-05.366390.parquet
- split: latest
path:
- results_2023-12-02T16-47-05.366390.parquet
---
# Dataset Card for Evaluation run of clibrain/Llama-2-ft-instruct-es
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/clibrain/Llama-2-ft-instruct-es
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [clibrain/Llama-2-ft-instruct-es](https://huggingface.co/clibrain/Llama-2-ft-instruct-es) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T16:47:05.366390](https://huggingface.co/datasets/open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es/blob/main/results_2023-12-02T16-47-05.366390.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jjzha/kompetencer | ---
license: cc-by-4.0
language: da
---
This is the Kompetencer dataset created by:
```
@inproceedings{zhang-etal-2022-kompetencer,
title = "Kompetencer: Fine-grained Skill Classification in {D}anish Job Postings via Distant Supervision and Transfer Learning",
author = "Zhang, Mike and
Jensen, Kristian N{\o}rgaard and
Plank, Barbara",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.46",
pages = "436--447",
}
```
There are document delimiters indicated by `idx`.
Number of samples (sentences):
- train: 778
- dev: 346
- test: 262
Sources:
- STAR (house)
Type of tags:
- Generic BIO tags with keys `tags_skill` and `tags_knowledge`
Sample:
```
{
"idx": 1,
"tokens": ["Du", "skal", "s\u00e6tte", "dagsordenen", "v\u00e6re", "v\u00e6rdiskabende", "og", "levere", "skarpt", "fagligt", "og", "strategisk", "med-", "og", "modspil", "."],
"tags_skill": ["O", "O", "B", "I", "B", "I", "O", "B", "I", "I", "I", "I", "I", "I", "I", "I"],
"tags_knowledge": ["O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"]
}
``` |
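Since the `tags_skill` and `tags_knowledge` fields use generic BIO tags ("B" opens a span, "I" continues it, "O" is outside), spans can be recovered by walking the tag sequence. Below is a minimal sketch of such a helper (`bio_to_spans` is a hypothetical name, not part of the dataset), applied to the sample record above:

```python
# Hypothetical helper: extract contiguous spans from the generic BIO tags
# used in Kompetencer ("B" opens a span, "I" continues it, "O" is outside).
def bio_to_spans(tokens, tags):
    spans, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag == "B":
            if current:
                spans.append(" ".join(current))
            current = [tok]
        elif tag == "I" and current is not None:
            current.append(tok)
        else:  # "O", or a stray "I" with no open span
            if current:
                spans.append(" ".join(current))
            current = None
    if current:
        spans.append(" ".join(current))
    return spans

tokens = ["Du", "skal", "s\u00e6tte", "dagsordenen", "v\u00e6re", "v\u00e6rdiskabende",
          "og", "levere", "skarpt", "fagligt", "og", "strategisk", "med-", "og",
          "modspil", "."]
tags_skill = ["O", "O", "B", "I", "B", "I", "O", "B", "I", "I", "I", "I", "I", "I", "I", "I"]

print(bio_to_spans(tokens, tags_skill))
# → ['sætte dagsordenen', 'være værdiskabende',
#    'levere skarpt fagligt og strategisk med- og modspil .']
```

Note that skill and knowledge spans are annotated in separate tag columns, so the helper is run once per column.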
liuyanchen1015/MULTI_VALUE_mnli_for_to | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 798882
num_examples: 3309
- name: dev_mismatched
num_bytes: 910879
num_examples: 3659
- name: test_matched
num_bytes: 767466
num_examples: 3133
- name: test_mismatched
num_bytes: 891150
num_examples: 3609
- name: train
num_bytes: 31803861
num_examples: 129455
download_size: 22146520
dataset_size: 35172238
---
# Dataset Card for "MULTI_VALUE_mnli_for_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jay401521/label1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: domain
dtype: string
- name: label
dtype: int64
- name: rank
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 922790.3333333334
num_examples: 10007
download_size: 475363
dataset_size: 922790.3333333334
---
# Dataset Card for "label1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mii-llm/cruciverba | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6790785
num_examples: 41793
download_size: 2407727
dataset_size: 6790785
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cruciverba"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
micsell/hebrew_keywords2 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': daateh_F
'1': daateh_M
'2': gadalt_F
'3': gadalt_M
'4': hona_F
'5': hona_M
'6': lah_F
'7': lah_M
'8': osa_F
'9': osa_M
'10': otah_F
'11': otah_M
'12': roza_F
'13': roza_M
'14': shelah_F
'15': shelah_M
- name: id
dtype: string
splits:
- name: train
num_bytes: 36433567.02614379
num_examples: 413
- name: test
num_bytes: 4423662.973856209
num_examples: 46
download_size: 36698185
dataset_size: 40857230.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
wj00037/1233 | ---
license: apache-2.0
---
|
mask-distilled-one-sec-cv12/chunk_154 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1163379424
num_examples: 228472
download_size: 1182358065
dataset_size: 1163379424
---
# Dataset Card for "chunk_154"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
re2panda/click_bate_article_train_val | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 603257228.0105588
num_examples: 195362
- name: validation
num_bytes: 67028923.98944115
num_examples: 21707
download_size: 387271308
dataset_size: 670286152.0
---
# Dataset Card for "click_bate_article_train_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
htriedman/wikidb | ---
dataset_info:
features:
- name: title
dtype: string
- name: description
dtype: string
- name: query
dtype: string
- name: extra_info
dtype: string
- name: wikidb
dtype: string
splits:
- name: train
num_bytes: 28659407
num_examples: 25555
download_size: 8339728
dataset_size: 28659407
---
# Dataset Card for "wikidb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangwang825/sst2-textfooler-7 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: augment
dtype: string
splits:
- name: train
num_bytes: 7161080
num_examples: 54359
- name: validation
num_bytes: 110096
num_examples: 872
- name: test
num_bytes: 226340
num_examples: 1821
download_size: 2029077
dataset_size: 7497516
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
jonathan-roberts1/MultiScene | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
sequence:
class_label:
names:
'0': apron
'1': baseball field
'2': basketball field
'3': beach
'4': bridge
'5': cemetery
'6': commercial
'7': farmland
'8': woodland
'9': golf course
'10': greenhouse
'11': helipad
'12': lake or pond
'13': oil field
'14': orchard
'15': parking lot
'16': park
'17': pier
'18': port
'19': quarry
'20': railway
'21': residential
'22': river
'23': roundabout
'24': runway
'25': soccer
'26': solar panel
'27': sparse shrub
'28': stadium
'29': storage tank
'30': tennis court
'31': train station
'32': wastewater plant
'33': wind turbine
'34': works
'35': sea
splits:
- name: train
num_bytes: 867506522
num_examples: 14000
download_size: 867005851
dataset_size: 867506522
license: mit
task_categories:
- image-classification
- zero-shot-image-classification
---
# Dataset Card for "MultiScene"
## Dataset Description
- **Paper** [MultiScene: A Large-scale Dataset and Benchmark for Multi-scene Recognition in Single Aerial Images](https://ieeexplore.ieee.org/iel7/36/4358825/09537917.pdf)
- **Split** Clean
### Split Information
This HuggingFace dataset repository contains just the 'Clean' split.
### Licensing Information
MIT.
## Citation Information
[MultiScene: A Large-scale Dataset and Benchmark for Multi-scene Recognition in Single Aerial Images](https://ieeexplore.ieee.org/iel7/36/4358825/09537917.pdf)
```
@article{hua2021multiscene,
title = {MultiScene: A Large-scale Dataset and Benchmark for Multi-scene Recognition in Single Aerial Images},
author = {Hua, Y. and Mou, L. and Jin, P. and Zhu, X. X.},
year = {in press},
journal = {IEEE Transactions on Geoscience and Remote Sensing}
}
``` |
NAYEIRN23/CASOSPJ2 | ---
license: llama2
---
|
vishnun0027/billsum_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 272407638
num_examples: 23455
download_size: 113741070
dataset_size: 272407638
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EleutherAI/quirky_authors_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 341746.06616247364
num_examples: 2429
- name: validation
num_bytes: 68175.393
num_examples: 484
- name: test
num_bytes: 65741.3675
num_examples: 470
download_size: 209896
dataset_size: 475662.8266624736
---
# Dataset Card for "quirky_authors_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
caiosoares26/zevaqueiro | ---
license: openrail
---
|
alexgshaw/llama-65b-tokenized-wikitext-2-v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 27888212
num_examples: 36718
download_size: 11634178
dataset_size: 27888212
---
# Dataset Card for "llama-65b-tokenized-wikitext-2-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
W1lson/dataset1 | ---
dataset_info:
features:
- name: data
list:
- name: title
dtype: string
- name: paragraphs
list:
- name: context
dtype: string
- name: qas
list:
- name: question
dtype: string
- name: id
dtype: string
- name: answers
list:
- name: text
dtype: string
- name: answer_start
dtype: int64
splits:
- name: train
num_bytes: 557
num_examples: 1
download_size: 5526
dataset_size: 557
---
# Dataset Card for "dataset1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JairoDanielMT/AppleQualitySorter | ---
license: mit
---
|
coastalcph/euandi_2019 | ---
license: cc-by-nc-sa-4.0
language:
- en
- de
- fr
- es
- it
- el
pretty_name: EUANDI
size_categories:
- n<1K
tags:
- politics
---
# Dataset Description
"EU and I" (EUANDI) was a voting assistance project, implemented and documented by [Michel et al. (2019)](https://cadmus.eui.eu/handle/1814/63870). EUANDI was publicly released before the 2019 EU election,
to help EU citizens find their affinity to candidate national parties. In [Chalkidis and Brandl (2024)](https://arxiv.org/abs/2403.13592), we re-distribute and experiment with the following resources:
- The EUANDI questionnaire comprises 22 questions in the form of a political statement followed by 5 options from complete disagreement to complete agreement. The questions are classified into 7 thematic topics and
2 political categories (Left/Right, and Anti-EU/Pro-EU). We provide the questions in 5 languages (en, de, fr, it, es, el).
- The positions of the parties from the EUANDI 2019 dataset. Each position consists of a party's short answer to a statement and a textual justification. We also include versions translated from the original language to English by Google Translate and Mixtral. We provide the positions for the top 5 parties
from 10 EU member states (Germany, France, Italy, Spain, Poland, Romania, Netherlands, Hungary, Portugal, and Greece).
# Data Instances
Example of data instance from the EUANDI questionnaire (`questionnaire`):
```
{'statement': {
'en': 'Social programmes should be maintained even at the cost of higher taxes.',
'de': 'Sozialstaatliche Leistungen sollten erhalten bleiben, selbst wenn dies zu höheren Steuern führt.',
'el': 'Τα προγράμματα κοινωνικής πρόνοιας πρέπει να διατηρηθούν ακόμα και αν οι φόροι πρέπει να αυξηθούν.',
'es': 'Las políticas sociales deberían mantenerse aunque eso implique una subida de impuestos.',
'fr': 'L’aide sociale devrait être maintenue même si pour cela les impôts doivent être augmentés.',
'it': 'I programmi per le politiche sociali dovrebbero essere mantenuti anche a costo di tasse pìu alte.'
},
'Liberal society': 0, 'Environmental protection': 0, 'EU integration': 0, 'Economic liberalization': -1,
'Finance restrictions': -1, 'Immigration restrictions': 0, 'Law and Order': 0, 'Left/Right': -1, 'Anti-EU/Pro-EU': 0
}
```
Example of data instance from the parties' positions (`party_positions`):
```
{'party_name': 'Grünen',
'full_party_name': 'Die Grünen',
'euro_party': 'Greens/EFA',
'country_iso': 'de',
'statement_1': {
'position': 'Allen Menschen in Europa wollen wir ein würdevolles Existenzmini-mum garantieren. [...]',
'translated_position_google': 'We want to guarantee a dignified existence mini mum to all people in Europe. [...]',
  'translated_position_mixtral': 'We want to guarantee a dignified existence minimum for all people in Europe. [...]',
'answer': 1
}
...
 'statement_22': {
'position': 'Für die Europawahlen unterstützen wir weiterhin das Prinzip der europäischen Spitzenkandidat*innen und transnationalen Listen. [...]',
'translated_position_google': 'For the European elections, we continue to support the principle of top European candidates and transnational lists. [...]',
'translated_position_mixtral': 'For the European elections, we continue to support the principle of European top candidate(s) and transnational lists. [...]',
'answer': 1
}
}
```
# How to Use
```python
from datasets import load_dataset
# Load the EUANDI questionnaire
euandi_questionnaire = load_dataset('coastalcph/euandi_2019', 'questionnaire')
# Load the EUANDI parties' positions
euandi_party_positions = load_dataset('coastalcph/euandi_2019', 'party_positions')
```
# Citations
For the original creator: *[euandi2019 : project description and datasets documentation (Michel et al., 2019)](https://cadmus.eui.eu/handle/1814/63870)*:
```
@article{euandi_2019,
  author = {Michel, Elie and Cicchi, Lorenzo and Garzia, Diego and Ferreira da Silva, Frederico and Trechsel, Alexander},
  year = {2019},
  month = {01},
  title = {euandi2019: Project Description and Datasets Documentation},
  journal = {SSRN Electronic Journal},
  doi = {10.2139/ssrn.3446677}
}
```
For our work, redistributing and augmenting the datasets: *[Llama meets EU: Investigating the European political spectrum through the lens of LLMs.
Ilias Chalkidis and Stephanie Brandl.
In the Proceedings of the Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL),
Mexico City, Mexico, June 16–21, 2024.](https://arxiv.org/abs/2403.13592)*
```
@inproceedings{chalkidis-and-brandl-eu-llama-2024,
title = "Llama meets EU: Investigating the European political spectrum through the lens of LLMs",
author = "Chalkidis, Ilias and Brandl, Stephanie",
booktitle = "Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics",
month = jun,
year = "2024",
address = "Mexico City, Mexico",
publisher = "Association for Computational Linguistics",
}
```
We highly recommend citing both aforementioned publications when working with the datasets. |
furry-br/zoey | ---
license: openrail
---
|
HoangHa/medical_bench_raw | ---
dataset_info:
features:
- name: questions
dtype: string
- name: a
dtype: string
- name: b
dtype: string
- name: c
dtype: string
- name: d
dtype: string
- name: correct_answer
dtype: string
- name: source_link
dtype: string
splits:
- name: train
num_bytes: 1553355
num_examples: 4770
download_size: 618016
dataset_size: 1553355
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Phando/uspto-50k | ---
dataset_info:
features:
- name: class
dtype: int64
- name: id
dtype: string
- name: prod_smiles
dtype: string
- name: rxn_smiles
dtype: string
- name: prod_smiles_pop
dtype: int64
- name: keep
dtype: bool
splits:
- name: train
num_bytes: 22822250.69997601
num_examples: 49015
- name: validation
num_bytes: 466083.3000239923
num_examples: 1001
download_size: 8864323
dataset_size: 23288334.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "uspto-50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bh8648/split_dataset_14 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: page_num
dtype: int64
splits:
- name: train
num_bytes: 838641
num_examples: 212
download_size: 427931
dataset_size: 838641
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "split_dataset_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhengxuanzenwu/dolly-starter_only | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 1218493
num_examples: 10544
download_size: 741990
dataset_size: 1218493
---
# Dataset Card for "dolly-starter_only"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_146 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1051962492.0
num_examples: 204981
download_size: 1075428919
dataset_size: 1051962492.0
---
# Dataset Card for "chunk_146"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Trelis/openassistant-deepseek-coder | ---
license: apache-2.0
language:
- en
- es
- ru
- de
- pl
- th
- vi
- sv
- bn
- da
- he
- it
- fa
- sk
- id
- nb
- el
- nl
- hu
- eu
- zh
- eo
- ja
- ca
- cs
- bg
- fi
- pt
- tr
- ro
- ar
- uk
- gl
- fr
- ko
tags:
- human-feedback
- deepseek coder
size_categories:
- 1K<n<10K
pretty_name: Filtered OpenAssistant Conversations
---
# Chat Fine-tuning Dataset - OpenAssistant DeepSeek Coder
This dataset allows for fine-tuning chat models using:
```
B_INST = '\n### Instruction:\n'
E_INST = '\n### Response:\n'
BOS = '<|begin▁of▁sentence|>'
EOS = '\n<|EOT|>\n'
```
Sample Preparation:
1. The dataset is cloned from [TimDettmers](https://huggingface.co/datasets/timdettmers/openassistant-guanaco), which itself is a subset of the Open Assistant dataset, which you can find [here](https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main). This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples.
1. The dataset was then filtered to:
- replace instances of '### Human:' with 'B_INST'
- replace instances of '### Assistant:' with 'E_INST'
- end assistant responses with the correct EOS.
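As a quick illustration of the resulting layout (the instruction/response text below is invented; only the token layout follows this card), a single-turn sample can be assembled from these special tokens like so:

```python
# Special tokens for this dataset, copied from the card above.
B_INST = '\n### Instruction:\n'
E_INST = '\n### Response:\n'
BOS = '<|begin▁of▁sentence|>'
EOS = '\n<|EOT|>\n'

def build_sample(turns):
    """Concatenate (instruction, response) pairs into one training sample.

    The conversation content is invented for illustration; only the
    token layout follows the dataset card.
    """
    text = BOS
    for instruction, response in turns:
        text += B_INST + instruction + E_INST + response + EOS
    return text

sample = build_sample([('Write hello world in Python.', 'print("hello world")')])
print(sample)
```

Multi-turn conversations simply repeat the instruction/response block before the final EOS.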
Details of the root dataset follow, copied from that repo:
# OpenAssistant Conversations Dataset (OASST1)
## Dataset Description
- **Homepage:** https://www.open-assistant.io/
- **Repository:** https://github.com/LAION-AI/Open-Assistant
- **Paper:** https://arxiv.org/abs/2304.07327
### Dataset Summary
In an effort to democratize research on large-scale alignment, we release OpenAssistant
Conversations (OASST1), a human-generated, human-annotated assistant-style conversation
corpus consisting of 161,443 messages in 35 different languages, annotated with 461,292
quality ratings, resulting in over 10,000 fully annotated conversation trees. The corpus
is a product of a worldwide crowd-sourcing effort involving over 13,500 volunteers.
Please refer to our [paper](https://arxiv.org/abs/2304.07327) for further details.
### Dataset Structure
This dataset contains message trees. Each message tree has an initial prompt message as the root node,
which can have multiple child messages as replies, and these child messages can have multiple replies.
All messages have a role property: this can either be "assistant" or "prompter". The roles in
conversation threads from prompt to leaf node strictly alternate between "prompter" and "assistant".
This version of the dataset contains data collected on the [open-assistant.io](https://open-assistant.io/) website until April 12 2023.
### JSON Example: Message
For readability, the following JSON examples are shown formatted with indentation on multiple lines.
Objects are stored without indentation (on single lines) in the actual jsonl files.
```json
{
"message_id": "218440fd-5317-4355-91dc-d001416df62b",
"parent_id": "13592dfb-a6f9-4748-a92c-32b34e239bb4",
"user_id": "8e95461f-5e94-4d8b-a2fb-d4717ce973e4",
"text": "It was the winter of 2035, and artificial intelligence (..)",
"role": "assistant",
"lang": "en",
"review_count": 3,
"review_result": true,
"deleted": false,
"rank": 0,
"synthetic": true,
"model_name": "oasst-sft-0_3000,max_new_tokens=400 (..)",
"labels": {
"spam": { "value": 0.0, "count": 3 },
"lang_mismatch": { "value": 0.0, "count": 3 },
"pii": { "value": 0.0, "count": 3 },
"not_appropriate": { "value": 0.0, "count": 3 },
"hate_speech": { "value": 0.0, "count": 3 },
"sexual_content": { "value": 0.0, "count": 3 },
"quality": { "value": 0.416, "count": 3 },
"toxicity": { "value": 0.16, "count": 3 },
"humor": { "value": 0.0, "count": 3 },
"creativity": { "value": 0.33, "count": 3 },
"violence": { "value": 0.16, "count": 3 }
}
}
```
### JSON Example: Conversation Tree
For readability, only a subset of the message properties is shown here.
```json
{
"message_tree_id": "14fbb664-a620-45ce-bee4-7c519b16a793",
"tree_state": "ready_for_export",
"prompt": {
"message_id": "14fbb664-a620-45ce-bee4-7c519b16a793",
"text": "Why can't we divide by 0? (..)",
"role": "prompter",
"lang": "en",
"replies": [
{
"message_id": "894d30b6-56b4-4605-a504-89dd15d4d1c8",
"text": "The reason we cannot divide by zero is because (..)",
"role": "assistant",
"lang": "en",
"replies": [
// ...
]
},
{
"message_id": "84d0913b-0fd9-4508-8ef5-205626a7039d",
"text": "The reason that the result of a division by zero is (..)",
"role": "assistant",
"lang": "en",
"replies": [
{
"message_id": "3352725e-f424-4e3b-a627-b6db831bdbaa",
"text": "Math is confusing. Like those weird Irrational (..)",
"role": "prompter",
"lang": "en",
"replies": [
{
"message_id": "f46207ca-3149-46e9-a466-9163d4ce499c",
"text": "Irrational numbers are simply numbers (..)",
"role": "assistant",
"lang": "en",
"replies": []
},
// ...
]
}
]
}
]
}
}
```
Please refer to [oasst-data](https://github.com/LAION-AI/Open-Assistant/tree/main/oasst-data) for
details about the data structure and Python code to read and write jsonl files containing oasst data objects.
If you would like to explore the dataset yourself you can find a
[`getting-started`](https://github.com/LAION-AI/Open-Assistant/blob/main/notebooks/openassistant-oasst1/getting-started.ipynb)
notebook in the `notebooks/openassistant-oasst1` folder of the [LAION-AI/Open-Assistant](https://github.com/LAION-AI/Open-Assistant)
github repository.
## Main Dataset Files
Conversation data is provided either as nested messages in trees (extension `.trees.jsonl.gz`)
or as a flat list (table) of messages (extension `.messages.jsonl.gz`).
### Ready For Export Trees
```
2023-04-12_oasst_ready.trees.jsonl.gz 10,364 trees with 88,838 total messages
2023-04-12_oasst_ready.messages.jsonl.gz 88,838 messages
```
Trees in `ready_for_export` state without spam and deleted messages including message labels.
The oasst_ready-trees file usually is sufficient for supervised fine-tuning (SFT) & reward model (RM) training.
### All Trees
```
2023-04-12_oasst_all.trees.jsonl.gz 66,497 trees with 161,443 total messages
2023-04-12_oasst_all.messages.jsonl.gz 161,443 messages
```
All trees, including those in states `prompt_lottery_waiting` (trees that consist of only one message, namely the initial prompt),
`aborted_low_grade` (trees that stopped growing because the messages had low quality), and `halted_by_moderator`.
### Supplemental Exports: Spam & Prompts
```
2023-04-12_oasst_spam.messages.jsonl.gz
```
These are messages which were deleted or have a negative review result (`"review_result": false`).
Besides low quality, a frequent reason for message deletion is a wrong language tag.
```
2023-04-12_oasst_prompts.messages.jsonl.gz
```
These are all the kept initial prompt messages with positive review result (no spam) of trees in `ready_for_export` or `prompt_lottery_waiting` state.
### Using the Huggingface Datasets
While HF datasets is ideal for tabular datasets, it is not a natural fit for nested data structures like the OpenAssistant conversation trees.
Nevertheless, we make all messages which can also be found in the file `2023-04-12_oasst_ready.trees.jsonl.gz` available in parquet as train/validation splits.
These are directly loadable by [Huggingface Datasets](https://pypi.org/project/datasets/).
To load the oasst1 train & validation splits use:
```python
from datasets import load_dataset
ds = load_dataset("OpenAssistant/oasst1")
train = ds['train'] # len(train)=84437 (95%)
val = ds['validation'] # len(val)=4401 (5%)
```
The messages appear in depth-first order of the message trees.
Full conversation trees can be reconstructed from the flat messages table by using the `parent_id`
and `message_id` properties to identify the parent-child relationship of messages. The `message_tree_id`
and `tree_state` properties (only present in flat messages files) can be used to find all messages of a message tree or to select trees by their state.
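As a minimal sketch of that reconstruction (toy messages below, not real OASST rows), trees can be rebuilt from the flat table by indexing children under their `parent_id` and nesting recursively:

```python
from collections import defaultdict

# Toy flat messages; real rows carry many more fields (see the JSON examples above).
messages = [
    {"message_id": "a", "parent_id": None, "text": "Why can't we divide by 0?"},
    {"message_id": "b", "parent_id": "a", "text": "Because ..."},
    {"message_id": "c", "parent_id": "b", "text": "Thanks!"},
]

# Group messages by their parent id.
children = defaultdict(list)
for msg in messages:
    children[msg["parent_id"]].append(msg)

def build_tree(msg):
    """Return the message with its replies nested, depth-first."""
    return {**msg, "replies": [build_tree(c) for c in children[msg["message_id"]]]}

# Roots are the messages with no parent (the initial prompts).
roots = [build_tree(m) for m in children[None]]
print(roots[0]["replies"][0]["replies"][0]["text"])
```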
### Languages
OpenAssistant Conversations incorporates 35 different languages with a distribution of messages as follows:
**Languages with over 1000 messages**
- English: 71956
- Spanish: 43061
- Russian: 9089
- German: 5279
- Chinese: 4962
- French: 4251
- Thai: 3042
- Portuguese (Brazil): 2969
- Catalan: 2260
- Korean: 1553
- Ukrainian: 1352
- Italian: 1320
- Japanese: 1018
<details>
<summary><b>Languages with under 1000 messages</b></summary>
<ul>
<li>Vietnamese: 952</li>
<li>Basque: 947</li>
<li>Polish: 886</li>
<li>Hungarian: 811</li>
<li>Arabic: 666</li>
<li>Dutch: 628</li>
<li>Swedish: 512</li>
<li>Turkish: 454</li>
<li>Finnish: 386</li>
<li>Czech: 372</li>
<li>Danish: 358</li>
<li>Galician: 339</li>
<li>Hebrew: 255</li>
<li>Romanian: 200</li>
<li>Norwegian Bokmål: 133</li>
<li>Indonesian: 115</li>
<li>Bulgarian: 95</li>
<li>Bengali: 82</li>
<li>Persian: 72</li>
<li>Greek: 66</li>
<li>Esperanto: 59</li>
<li>Slovak: 19</li>
</ul>
</details>
## Contact
- Discord [Open Assistant Discord Server](https://ykilcher.com/open-assistant-discord)
- GitHub: [LAION-AI/Open-Assistant](https://github.com/LAION-AI/Open-Assistant)
- E-Mail: [open-assistant@laion.ai](mailto:open-assistant@laion.ai) |
Slep/LAION-RVS-Fashion | ---
license: mit
language:
- en
tags:
- fashion
- visual search
pretty_name: LAION — Referred Visual Search — Fashion
size_categories:
- 1M<n<10M
---
# **LAION — Referred Visual Search — Fashion**
*Introduced in **Weakly-Supervised Conditional Embedding for Referred Visual Search***
**[CRITEO AI Lab](https://ailab.criteo.com)** x **[ENPC](https://imagine-lab.enpc.fr)**
[Simon Lepage](https://simon-lepage.github.io), Jérémie Mary, [David Picard](https://davidpicard.github.io)
[[`Paper`](https://arxiv.org/abs/2306.02928)]
[[`Demo`](https://huggingface.co/spaces/Slep/CondViT-LRVSF-Demo)]
[[`Code`](https://github.com/Simon-Lepage/CondViT-LRVSF)]
[[`BibTeX`](#citing-the-dataset)]
---
## **Composition**
LAION-RVS-Fashion is composed of images from :
- **[LAION 2B EN](https://huggingface.co/datasets/laion/laion2B-en)**
- **[LAION 2B MULTI TRANSLATED](https://huggingface.co/datasets/laion/laion2B-multi-joined-translated-to-en)**
- **[LAION 1B NOLANG TRANSLATED](https://huggingface.co/datasets/laion/laion1B-nolang-joined-translated-to-en)**
These images have been grouped based on extracted product IDs. Each product in the training set is composed of at least a single image (isolated product), and a complex image (scene). We added categorical metadata and BLIP2 captions to each product. Please see the [samples](#samples) and refer to [our paper](https://arxiv.org/abs/2306.02928) for additional details.
|Split|Products|Distractors|
|-:|:-:|:-:|
|Train|272,457|-|
|Valid|400|99,541|
|Test|2,000|2,000,014|
**Total number of training images:** 841,718.
## **Samples**
<table style='text-align:center'>
<tbody>
<tr>
<td></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/97969.0.jpg" style="height:200px"></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/97969.1.jpg" style="height:200px"></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/219924.0.jpg" style="height:200px"></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/219924.1.jpg" style="height:200px"></td>
</tr>
<tr>
<td><b>Categories</b></td>
<td colspan=2>Neck</td>
<td colspan=2>Lower Body</td>
</tr>
<tr>
<td><b>BLIP2 Captions</b></td>
<td colspan=2>a scarf with multi-coloured stripes</td>
<td colspan=2>stella pants - dark suede</td>
</tr>
<tr></tr>
<tr>
<td></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/72317.0.jpg" style="height:200px"></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/72317.1.jpg" style="height:200px"></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/108856.0.jpg" style="height:200px"></td>
<td><img src="https://huggingface.co/datasets/Slep/LAION-RVS-Fashion/resolve/main/assets/108856.1.jpg" style="height:200px"></td>
</tr>
<tr>
<td><b>Categories</b></td>
<td colspan=2>Feet</td>
<td colspan=2>Bags</td>
</tr>
<tr>
<td><b>BLIP2 Captions</b></td>
<td colspan=2>neon green patent leather heels with studs</td>
<td colspan=2>the burberry small leather bag is brown and leather</td>
</tr>
</tbody>
</table>
## **Attributes**
- **URL**, **WIDTH**, **HEIGHT**, **punsafe**, **pwatermark**, **language**: Original LAION fields. Please refer to their repository.
- **TEXT**: Text originally associated with the image.
- **ENG_TEXT** : Translated version for MULTI/NOLANG, copy of TEXT for EN.
- **TYPE**: SIMPLE (isolated products), COMPLEX (scenes), PARTIAL_COMPLEX (zoomed-in scenes)
- **PRODUCT_ID**: Product identifier, allows to group together images depicting the same product.
- **INDEX_SRC**: ID of parquet file originally storing this image.
- **CATEGORY**: Categories of the products - `Bags, Feet, Hands, Head, Lower Body, Neck, Outwear, Upper Body, Waist, Whole Body` for the products, and `NonClothing` for some distractors.
- **blip2_caption1, blip2_caption2**: [BLIP2-FlanT5XL](https://huggingface.co/Salesforce/blip2-flan-t5-xl)-generated captions.
We also release `bootstrap_IDs.pkl`, the file used to generate the bootstrapped results of the paper. `test_subsets` is composed of [product IDs](https://github.com/Simon-Lepage/CondViT-LRVSF/blob/b660d82b5775de417ba81ac846b6df004b31eb75/lrvsf/test/metrics.py#L229), while `dist_{N}_subsets` are [row indices](https://github.com/Simon-Lepage/CondViT-LRVSF/blob/b660d82b5775de417ba81ac846b6df004b31eb75/lrvsf/test/metrics.py#L248).
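As an illustrative sketch (the rows below are invented; field names follow the attribute list above), products can be reassembled from the flat metadata by grouping on `PRODUCT_ID` and separating views by `TYPE`:

```python
from collections import defaultdict

# Toy metadata rows with invented URLs; real rows carry many more attributes.
rows = [
    {"URL": "http://example.com/1.jpg", "PRODUCT_ID": 97969, "TYPE": "SIMPLE"},
    {"URL": "http://example.com/2.jpg", "PRODUCT_ID": 97969, "TYPE": "COMPLEX"},
    {"URL": "http://example.com/3.jpg", "PRODUCT_ID": 219924, "TYPE": "SIMPLE"},
]

# Map each product id to its isolated-product and scene views.
products = defaultdict(lambda: {"SIMPLE": [], "COMPLEX": [], "PARTIAL_COMPLEX": []})
for row in rows:
    products[row["PRODUCT_ID"]][row["TYPE"]].append(row["URL"])

# Product 97969 now has one isolated view and one scene view.
print(products[97969]["SIMPLE"], products[97969]["COMPLEX"])
```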
---
## Citing the dataset
To cite our work, please use the following BibTeX entry :
```
@article{lepage2023condvit,
title={Weakly-Supervised Conditional Embedding for Referred Visual Search},
author={Lepage, Simon and Mary, Jérémie and Picard, David},
journal={arXiv:2306.02928},
year={2023}
}
``` |
HiTZ/EusExams | ---
license: cc-by-sa-4.0
task_categories:
- question-answering
- multiple-choice
language:
- eu
- es
tags:
- legal
pretty_name: EusExams
size_categories:
- 10K<n<100K
configs:
- config_name: eu_opeosakiadmineu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakiadmineu.jsonl"
- config_name: eu_opeosakiauxenfeu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakiauxenfeu.jsonl"
- config_name: eu_opeosakiauxeu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakiauxeu.jsonl"
- config_name: eu_opeosakiceladoreu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakiceladoreu.jsonl"
- config_name: eu_opeosakienfeu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakienfeu.jsonl"
- config_name: eu_opeosakioperarioeu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakioperarioeu.jsonl"
- config_name: eu_opeosakitecnicoeu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakitecnicoeu.jsonl"
- config_name: eu_opeosakivarioseu
data_files:
- split: test
path: "data/eu/opeosaki/opeosakivarioseu.jsonl"
- config_name: eu_opegasteizkoudala
data_files:
- split: test
path: "data/eu/opegasteiz/opegasteizkoudala.jsonl"
- config_name: eu_opeehuadmineu
data_files:
- split: test
path: "data/eu/opeehu/opeehuadmineu.jsonl"
- config_name: eu_opeehuauxeu
data_files:
- split: test
path: "data/eu/opeehu/opeehuauxeu.jsonl"
- config_name: eu_opeehubiblioeu
data_files:
- split: test
path: "data/eu/opeehu/opeehubiblioeu.jsonl"
- config_name: eu_opeehuderechoeu
data_files:
- split: test
path: "data/eu/opeehu/opeehuderechoeu.jsonl"
- config_name: eu_opeehueconomicaseu
data_files:
- split: test
path: "data/eu/opeehu/opeehueconomicaseu.jsonl"
- config_name: eu_opeehuempresarialeseu
data_files:
- split: test
path: "data/eu/opeehu/opeehuempresarialeseu.jsonl"
- config_name: eu_opeehusubalternoeu
data_files:
- split: test
path: "data/eu/opeehu/opeehusubalternoeu.jsonl"
- config_name: eu_opeehutecnicoeu
data_files:
- split: test
path: "data/eu/opeehu/opeehutecnicoeu.jsonl"
- config_name: eu_opeehuteknikarib
data_files:
- split: test
path: "data/eu/opeehu/opeehuteknikarib.jsonl"
- config_name: eu_ejadministrari
data_files:
- split: test
path: "data/eu/ope/ejadministrari.jsonl"
- config_name: eu_ejlaguntza
data_files:
- split: test
path: "data/eu/ope/ejlaguntza.jsonl"
- config_name: eu_ejlaguntzaile
data_files:
- split: test
path: "data/eu/ope/ejlaguntzaile.jsonl"
- config_name: eu_ejteknikari
data_files:
- split: test
path: "data/eu/ope/ejteknikari.jsonl"
- config_name: eu_osakidetza1e
data_files:
- split: test
path: "data/eu/osakidetza/osakidetza1e.jsonl"
- config_name: eu_osakidetza2e
data_files:
- split: test
path: "data/eu/osakidetza/osakidetza2e.jsonl"
- config_name: eu_osakidetza3e
data_files:
- split: test
path: "data/eu/osakidetza/osakidetza3e.jsonl"
- config_name: eu_osakidetza5e
data_files:
- split: test
path: "data/eu/osakidetza/osakidetza5e.jsonl"
- config_name: eu_osakidetza6e
data_files:
- split: test
path: "data/eu/osakidetza/osakidetza6e.jsonl"
- config_name: eu_osakidetza7e
data_files:
- split: test
path: "data/eu/osakidetza/osakidetza7e.jsonl"
- config_name: eu_opebilbaoeu
data_files:
- split: test
path: "data/eu/opebilbao/opebilbaoeu.jsonl"
- config_name: es_opeosakiadmin
data_files:
- split: test
path: "data/es/opeosaki/opeosakiadmin.jsonl"
- config_name: es_opeosakiaux
data_files:
- split: test
path: "data/es/opeosaki/opeosakiaux.jsonl"
- config_name: es_opeosakiauxenf
data_files:
- split: test
path: "data/es/opeosaki/opeosakiauxenf.jsonl"
- config_name: es_opeosakicelador
data_files:
- split: test
path: "data/es/opeosaki/opeosakicelador.jsonl"
- config_name: es_opeosakienf
data_files:
- split: test
path: "data/es/opeosaki/opeosakienf.jsonl"
- config_name: es_opeosakijuridico
data_files:
- split: test
path: "data/es/opeosaki/opeosakijuridico.jsonl"
- config_name: es_opeosakioperario
data_files:
- split: test
path: "data/es/opeosaki/opeosakioperario.jsonl"
- config_name: es_opeosakitecnico
data_files:
- split: test
path: "data/es/opeosaki/opeosakitecnico.jsonl"
- config_name: es_opeosakivarios
data_files:
- split: test
path: "data/es/opeosaki/opeosakivarios.jsonl"
- config_name: es_opeayuntamientovitoria
data_files:
- split: test
path: "data/es/opegasteiz/opeayuntamientovitoria.jsonl"
- config_name: es_opeehuadmin
data_files:
- split: test
path: "data/es/opeehu/opeehuadmin.jsonl"
- config_name: es_opeehuaux
data_files:
- split: test
path: "data/es/opeehu/opeehuaux.jsonl"
- config_name: es_opeehubiblio
data_files:
- split: test
path: "data/es/opeehu/opeehubiblio.jsonl"
- config_name: es_opeehuderecho
data_files:
- split: test
path: "data/es/opeehu/opeehuderecho.jsonl"
- config_name: es_opeehueconomicas
data_files:
- split: test
path: "data/es/opeehu/opeehueconomicas.jsonl"
- config_name: es_opeehuempresariales
data_files:
- split: test
path: "data/es/opeehu/opeehuempresariales.jsonl"
- config_name: es_opeehusubalterno
data_files:
- split: test
path: "data/es/opeehu/opeehusubalterno.jsonl"
- config_name: es_opeehutecnico
data_files:
- split: test
path: "data/es/opeehu/opeehutecnico.jsonl"
- config_name: es_opeehutecnicob
data_files:
- split: test
path: "data/es/opeehu/opeehutecnicob.jsonl"
- config_name: es_ejadministrativo
data_files:
- split: test
path: "data/es/ope/ejadministrativo.jsonl"
- config_name: es_ejauxiliar
data_files:
- split: test
path: "data/es/ope/ejauxiliar.jsonl"
- config_name: es_ejsubalterno
data_files:
- split: test
path: "data/es/ope/ejsubalterno.jsonl"
- config_name: es_ejtecnico
data_files:
- split: test
path: "data/es/ope/ejtecnico.jsonl"
- config_name: es_osakidetza1c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza1c.jsonl"
- config_name: es_osakidetza2c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza2c.jsonl"
- config_name: es_osakidetza3c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza3c.jsonl"
- config_name: es_osakidetza4c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza4c.jsonl"
- config_name: es_osakidetza5c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza5c.jsonl"
- config_name: es_osakidetza6c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza6c.jsonl"
- config_name: es_osakidetza7c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza7c.jsonl"
- config_name: es_osakidetza8c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza8c.jsonl"
- config_name: es_osakidetza9c
data_files:
- split: test
path: "data/es/osakidetza/osakidetza9c.jsonl"
- config_name: es_opebilbao
data_files:
- split: test
path: "data/es/opebilbao/opebilbao.jsonl"
---
# Dataset Card for EusExams
EusExams is a collection of tests designed to prepare individuals for Public Service examinations conducted by several Basque institutions, including the public health system Osakidetza, the Basque Government, the City Councils of Bilbao and Gasteiz, and the University of the Basque Country (UPV/EHU). Within each of these groups, there are different exams for public positions, such as administrative and assistant roles. Each multiple-choice question contains 2 to 4 choices (3.90 on average) and one correct answer. The dataset is mostly parallel with 16k questions in Basque and 18k in Spanish.
- **Curated by:** HiTZ Research Center & IXA Research group (University of the Basque Country UPV/EHU)
- **Language(s) (NLP):** Basque (eu) and Spanish (es)
- 📒 Blog Post: [Latxa: An Open Language Model and Evaluation Suite for Basque](https://www.hitz.eus/en/node/340)
- 📖 Paper: [Latxa: An Open Language Model and Evaluation Suite for Basque](https://arxiv.org/abs/2403.20266)
- 💻 Code: [hitz-zentroa/latxa](https://github.com/hitz-zentroa/latxa)
- 📧 Contact: [hitz@ehu.eus](mailto:hitz@ehu.eus)
## Example
Basque Example:
```txt
Galdera: UPV/EHUREN ONDAREA HAU DA:
A. UPV/EHUk jabetzan dituen ondasunak.
B. UPV/EHUk jabetzan dituen ondasun eta eskubideak.
C. UPV/EHUk jabetzan edo titularitatean dituen ondasun eta eskubideak, bai eta etorkizunean eskuratzen edo esleitzen zaizkion gainerako guztiak ere.
D. UPV/EHUk jabetzan dituen ondasunak, bai eta etorkizunean eskuratzen dituen gainerako guztiak ere.
Erantzuna: C
```
English Translation:
```txt
Question: UPV/EHU’S LEGACY IS:
A. The property owned by UPV/EHU.
B. The rights and property owned by the UPV/EHU.
C. The rights and property of the UPV/EHU in ownership, as well as any other property acquired or assigned to it in the future.
D. The property of the UPV/EHU in ownership, as well as any other property acquired or assigned to it in the future.
Answer: C
```
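Since each question has exactly one gold choice letter, evaluation reduces to plain accuracy over the predicted letters. A minimal sketch (the function name is ours, not from any official evaluation suite):

```python
def accuracy(predictions, gold):
    """Fraction of questions where the predicted choice letter matches the gold one."""
    assert len(predictions) == len(gold)
    correct = sum(p == g for p, g in zip(predictions, gold))
    return correct / len(gold)

# Three of four answers match the gold letters.
assert accuracy(["C", "A", "B", "D"], ["C", "B", "B", "D"]) == 0.75
```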
## Citation
```bibtex
@misc{etxaniz2024latxa,
title={{L}atxa: An Open Language Model and Evaluation Suite for {B}asque},
author={Julen Etxaniz and Oscar Sainz and Naiara Perez and Itziar Aldabe and German Rigau and Eneko Agirre and Aitor Ormazabal and Mikel Artetxe and Aitor Soroa},
year={2024},
eprint={2403.20266},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
snipaid/instruct-snippet-mlsum | ---
license: mit
language: de
tags:
- news
- headline generation
- teaser generation
- keyword generation
- tweet generation
- serp title-tag generation
- serp meta-description generation
- news snippet generation
size_categories:
- 1K<n<10K
task_categories:
- summarization
- text2text-generation
pretty_name: Instruct-Snippet-MLSUM-500
---
# Dataset Card for Instruct-Snippet-MLSUM-500
### Dataset Summary
This is a multitask instruction finetuning dataset for the task of news snippet generation. It is built from a sample of ~500 news articles from the [MLSUM](https://huggingface.co/datasets/mlsum) dataset, augmented with machine-generated news snippets.
### Supported Tasks
This dataset was created to support the task of generating news snippets such as titles, teasers, keywords, SERP title-tags, meta-descriptions and tweets for news articles in German.
### Languages
de - German
## Dataset Structure
- `label`: a string feature.
- `instruction`: a string feature.
- `input`: a string feature.
- `output`: a string feature.
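A record pairs one instruction with an article and the target snippet. A hypothetical example of the schema (all field values below are invented for illustration, not taken from the dataset):

```python
# Hypothetical record illustrating the schema; values are invented.
record = {
    "label": "headline",
    "instruction": "Generiere eine passende Überschrift für den folgenden Artikel.",
    "input": "Die Stadtverwaltung stellte heute ihre Pläne für das kommende Jahr vor.",
    "output": "Stadtverwaltung stellt neue Pläne vor",
}
assert set(record) == {"label", "instruction", "input", "output"}
```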
## Dataset Creation
This dataset was created from Snippet-MLSUM-500. See [Snippet-MLSUM-500](https://huggingface.co/datasets/snipaid/snippet-mlsum-500) for the dataset without instructions.
Instructions were generated with GPT-3.5 from a human-curated seed-set of instructions.
## Considerations for Using the Data
### Known Limitations
Part of the snippet data is machine generated. Be aware that these features (specifically: output) may exhibit signs of model hallucination, toxicity and stereotypes.
## Additional Information
See [Instruct-Snippet-MLSUM-500-V2](https://huggingface.co/datasets/snipaid/instruct-snippet-mlsum-500-v2) if you are interested in an improved successor, with further support for summaries.
### Licensing Information
This dataset is licensed under MIT license. |
huggingartists/adele | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/adele"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.304292 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/45ccf22bba4c1f80989e645c2fd4ec44.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/adele">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Adele</div>
<a href="https://genius.com/artists/adele">
<div style="text-align: center; font-size: 14px;">@adele</div>
</a>
</div>
### Dataset Summary
This dataset contains lyrics parsed from Genius and is designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/adele).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/adele")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|203| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/adele")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03  # remainder after train and validation

# Split the single 'train' split at the 90% and 97% marks.
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
projecte-aina/CATalog | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- ca
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
size_categories:
- 10B<n<100B
source_datasets:
- extended|mc4
- extended|oscar
- extended|cawac
task_categories:
- fill-mask
- text-generation
task_ids:
- masked-language-modeling
- slot-filling
- language-modeling
pretty_name: CATalog
tags: []
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: score
dtype: float64
- name: strategy
dtype: string
- name: languages
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 115827685843
num_examples: 34314510
download_size: 31532509161
dataset_size: 115827685843
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## Dataset Description
- **Homepage** [Projecte AINA](https://projecteaina.cat/tech/)
- **Repository** [HuggingFace](https://huggingface.co/projecte-aina)
- **Paper** ["A CURATEd CATalog: Rethinking the Extraction of Pretraining Corpora for Mid-Resourced Languages"]()
- **Leaderboard** N/A
- **Point of Contact** langtech@bsc.es
### Dataset Summary
CATalog is a diverse, open-source Catalan corpus for language modelling. It consists of text documents from 26 different sources, including web crawling, news, forums, digital libraries and public institutions, totaling 17.45 billion words.
### Supported Tasks and Leaderboards
- `Fill-Mask`
- `Text Generation`
- `other:Language-Modelling`: The dataset is suitable for training a model in Language Modelling, predicting the next word in a given context. Success is measured by achieving a low [Perplexity](https://huggingface.co/spaces/evaluate-metric/perplexity) score, indicating the model's proficiency in accurately predicting subsequent words.
- `other:Masked-Language-Modelling`: The dataset is designed for training models in Masked Language Modelling. This task involves predicting masked or hidden words within a sentence. Success is typically measured by achieving a high performance score, such as accuracy or [F1](https://huggingface.co/spaces/evaluate-metric/f1) score, on correctly predicting the masked tokens.
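For intuition on the perplexity metric mentioned above: it is simply the exponential of the model's average per-token negative log-likelihood, so lower is better. A minimal illustration:

```python
import math

def perplexity(avg_nll: float) -> float:
    """Perplexity is the exponential of the average negative log-likelihood (in nats)."""
    return math.exp(avg_nll)

# A model with an average per-token loss of 2.0 nats has perplexity e^2 ≈ 7.39.
assert round(perplexity(2.0), 2) == 7.39
```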
### Languages
This dataset is in Catalan (ca-ES). Coming from the web, some documents may contain other languages.
## Dataset Structure
### Data Instances
The dataset is provided in a JSONL format, where each row corresponds to a single document and contains a document identifier, the text, a quality score, the strategy used to evaluate the document quality, languages, and a URL of the document, if available.
```
{
"id": "macocu_ca_20230731_9_402472",
"text": "Jaume Casañas relleva Dolors Carreras a l’Alcaldia de l’Ajuntament de Cunit.
La substitució prevista al pacte de govern del 2019 s’ha materialitzat aquest
dissabte al matí. Aquest dissabte al matí, en un acte al Casal Municipal de
Cunit, s’ha celebrat l’acte de relleu de l’Alcaldia de l’Ajuntament de Cunit,
segons preveia el pacte de govern signat el juny del 2019 pels grups del PSC,
encapçalat per la fins ara alcaldessa, Dolors Carreras, i Impulsem Cunit, amb
el ja nou alcalde, Jaume Casañas, al capdavant.",
"score": 0.8105327621841463,
"strategy": "curate",
"languages": "{"ca": 1.0}",
"url": ""
}
```
### Data Fields
- `id`: text string containing the document identifier. Consists of the subdataset code, the part number and a document number.
- `text`: text string from the document, with paragraphs separated by two newlines escape sequences. It is meant to be used directly as input for language modelling.
- `score`: positive float number representing the document quality, ranging from 0, which represents the worst quality, to 1, the best quality.
- `strategy`: text string describing the type of evaluation applied to obtain the document score. "curate" uses the heuristic evaluation from [CURATE](https://github.com/langtech-bsc/corpus-cleaner-v2) and "perfect" means that manual review was done and the highest score (1) is applied.
- `languages`: dictionary containing the document languages, with a percentage indicating the character ratio for each one.
- `url`: text string with the URL of the document, if available.
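Because every document carries a continuous score rather than a binary keep/drop decision, downstream users can filter the corpus to any quality threshold they need. A minimal sketch with invented documents:

```python
# Hypothetical documents; only 'score' matters for this sketch.
docs = [
    {"id": "a", "text": "...", "score": 0.81, "strategy": "curate"},
    {"id": "b", "text": "...", "score": 0.42, "strategy": "curate"},
    {"id": "c", "text": "...", "score": 1.0,  "strategy": "perfect"},
]

# Keep only documents at or above a chosen quality threshold.
high_quality = [d for d in docs if d["score"] >= 0.7]
assert [d["id"] for d in high_quality] == ["a", "c"]
```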
### Data Splits
We do not provide any canonical splits for CATalog.
## Dataset Creation
### Curation Rationale
CATalog is mainly built on filtered, non-overlapping versions of [CommonCrawl](https://commoncrawl.org/) snapshots and a smaller set of manually selected corpora from specific sources. We use the [CURATE](https://github.com/langtech-bsc/corpus-cleaner-v2) pipeline, which combines exact deduplication, language identification, and scoring heuristics.
In the design of CATalog, we adhere to the following values:
- (1) **Scale & Flexibility**. We intend to produce datasets that have a significant impact on the training of multilingual models in the range of 7B-180B parameters. Since Catalan is a medium-resource language and data acquisition is already a challenge, binary filtering will limit us in terms of the amount of data. By providing a score, we are able to easily filter the corpus according to any requirement.
- (2) **Neutral scoring**. As opposed to ML-based filtering, we use simple rules and heuristics to avoid introducing further bias into the model ([Dodge et al., 2021](https://arxiv.org/abs/2104.08758); [Welbl et al., 2021](https://arxiv.org/abs/2109.07445)). We only use [FastText](https://fasttext.cc/docs/en/language-identification.html) to reject documents in other languages.
During development, we performed comparative judgment experiments to evaluate the usefulness of the scoring from the [CURATE](https://github.com/langtech-bsc/corpus-cleaner-v2) pipeline, which is intended for further filtering and analysis. We found a moderate correlation between the score and the perceived quality of the text. Our main goal was to maximize the usability of the corpus without getting into a trade-off between quantity and quality.
### Source Data
#### Initial Data Collection and Normalization
We applied extensive data processing using our [CURATE](https://github.com/langtech-bsc/corpus-cleaner-v2) pipeline.
We first filter documents by their language content using [FastText](https://fasttext.cc/docs/en/language-identification.html). Only documents with at least 50% of characters in Catalan are kept. We then perform exact document deduplication. After this stage, we score each document with a tested set of 8 heuristic evaluators, inspired from other web filterings and from our own creation.
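The 50% character-ratio language filter can be sketched as follows (a simplified illustration; the actual pipeline derives the ratios from FastText predictions):

```python
def keep_document(lang_ratios: dict, target: str = "ca", threshold: float = 0.5) -> bool:
    """Keep a document only if at least `threshold` of its characters are in the target language."""
    return lang_ratios.get(target, 0.0) >= threshold

assert keep_document({"ca": 0.92, "es": 0.08})      # mostly Catalan: kept
assert not keep_document({"ca": 0.30, "es": 0.70})  # mostly Spanish: rejected
```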
The following pre-existing datasets were used:
- [`OSCAR-2301`](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301)
- [`OSCAR-2201`](https://huggingface.co/datasets/oscar-corpus/OSCAR-2201)
- [`CaText`](https://zenodo.org/records/5483031)
- [`MaCoCu-ca 1.0`](http://hdl.handle.net/11356/1837)
- [`caWaC`](https://huggingface.co/datasets/cawac)
- [`Colossal OSCAR 1.0`](https://huggingface.co/datasets/oscar-corpus/colossal-oscar-1.0)
- [`mC4`](https://huggingface.co/datasets/mc4)
#### Who are the source language producers?
Apart from the pre-existing datasets, all of them coming from [CommonCrawl](https://commoncrawl.org/) dumps, the following
sources provided their data under Open Data Agreements:
- ## Media Groups
- [`IB3`](https://ib3.org/)
- [`Grup El Món`](https://grupmon.cat/)
- [`Vilaweb`](https://www.vilaweb.cat/)
- [`Nació Digital`](https://www.naciodigital.cat/)
- [`ACN`](https://www.acn.cat/)
- [`Racó Català Articles`](https://www.racocatala.cat/)
- [`Racó Català Fòrums (anonymized version)`](https://huggingface.co/datasets/projecte-aina/raco_forums)
- [`Aquí Berguedà`](https://www.aquibergueda.cat/)
- ## Academic & Book Repositories
- [`Tesis Doctorals en Xarxa (TDX)`](https://www.tesisenred.net/)
- [`Wikipedia`](https://ca.wikipedia.org/)
- [`Project Gutenberg`](https://www.gutenberg.org/)
- ## Government Institutions
- [`Parlament de Catalunya`](https://www.parlament.cat/web/index.html)
- [`Les Corts Valencianes`](https://www.cortsvalencianes.es/)
- [`Diari Oficial de la Generalitat Valenciana`](https://dogv.gva.es/)
- [`Butlletí Oficial de la Universitat d'Alacant`](https://www.boua.ua.es/)
### Annotations
The score is an automatic label obtained from the aggregation of different heuristic evaluators based on predefined thresholds. Specific evaluators penalize documents for factors like minimum word count, average word per sentence, punctuation per word rate, unique sentences ratio, stopword ratio, Brunet index, language diversity, and content identified by regular expressions, providing a comprehensive approach to document scoring.
#### Annotation process
The process involves assigning scores between 0 and 1 to sentences, paragraphs, and documents in a hierarchical manner. Individual evaluators at different levels contribute scores that are combined using geometric means, emphasizing a probability-like interpretation to encourage evaluators to assess desirability. The final document score is derived through analogous aggregation of paragraph and document scores, distinct from a linear model.
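The multiplicative, probability-like aggregation described above can be sketched with a plain geometric mean (a simplified illustration, not the exact CURATE implementation):

```python
import math

def geometric_mean(scores):
    """Aggregate sub-scores in [0, 1] multiplicatively, probability-style."""
    assert scores and all(0.0 <= s <= 1.0 for s in scores)
    return math.prod(scores) ** (1.0 / len(scores))

# One poor sub-score drags the aggregate down more than an arithmetic mean would.
assert round(geometric_mean([0.9, 0.9, 0.1]), 3) == 0.433
```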
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
Being partially constructed from Common Crawl, personal and sensitive information might be present.
This must be considered before training deep learning models with CATalog, especially in the case of text-generation models.
## Considerations for Using the Data
### Social Impact of Dataset
CATalog promotes the Catalan language in the NLP field, enabling development of advanced applications and chatbots tailored to Catalan speakers, while improving access to information for better community understanding. However, most of the sources in the dataset are web-scraped, which may bring in biases and privacy issues, risking flawed outcomes and potential misuse.
Given that Catalan is a mid-resourced language with low representation in digital sources, this dataset becomes crucial for building inclusive NLP applications. It addresses the language's underrepresentation, empowering the Catalan community with improved access to text resources in their native language. However, careful consideration of potential biases and privacy issues is essential to ensure responsible and equitable technology use.
### Discussion of Biases
Web-crawled content is over-represented with standard language varieties, impacting language model performance for minority languages. Language diversity in data is crucial to avoid bias, especially in encoding non-standard dialects, preventing the exclusion of demographic groups. Our corpus primarily focuses on Central Catalan, but we actively include Valencian and Balearic Catalan, along with diverse sociolects from platforms like Racó Català Fòrums, aiming for a more representative dataset. Despite legal uncertainties in web-scraped data, we prioritize permissive licenses and privacy protection measures, acknowledging the challenges posed by personally identifiable information (PII) within large-scale datasets. Our ongoing efforts aim to address privacy concerns and contribute to a more inclusive linguistic dataset.
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Language Technologies Unit (langtech@bsc.es) at the Barcelona Supercomputing Center (BSC).
### Funding
This work has been promoted and financed by the Generalitat de Catalunya through the [Aina project](https://projecteaina.cat/).
### Licensing Information
CATalog is a collection of text documents from sources with various licenses. The whole work is licensed under the most restrictive license in the corpus, which is [Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International](https://creativecommons.org/licenses/by-nc-nd/4.0/deed.es) license. Any use of all or part of the text gathered in CATalog must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
The list of [SPDX license identifiers](https://spdx.org/licenses/) included in the documentation can be found in the following table or in this [JSON file](https://huggingface.co/datasets/projecte-aina/CATalog/blob/main/licenses.json).
| Source | Identifier | License |
| ----------------------- | ----------------------------------- | ------------------------- |
| Tesis Doctorales en Xarxa (TDX) | tdx_ca_20220518 | [CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode) |
| Wikipedia | wikipedia_ca_20230401 | [CC-BY-SA-4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode) |
| IB3 | crawling-ib3_ca_20230205 | Data Sharing Agreement\* |
| Les Corts Valencianes | les-corts-valencianes_ca_20230704 | Data Sharing Agreement\* |
| Grup El Món | grup-elmon_ca_20230726 | Data Sharing Agreement\* |
| Vilaweb | vilaweb_ca_20220728 | Data Sharing Agreement\* |
| Nació Digital | naciodigital_ca_20220331 | [CC-BY-NC-ND-4.0](https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode) |
| ACN | acn_ca_20201011 | Data Sharing Agreement\* |
| Racó Català Articles | racoarticles_ca_20221005 | Data Sharing Agreement\* |
| Racó Català Fòrums | racoforumsanon_ca_20211213 | Data Sharing Agreement\* |
| Wikimedia | wikimedia_ca_20230829 | [CC-BY-SA-4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode) |
| Project Gutenberg | gutenberg_ca_20220224 | [Project Gutenberg ToU](https://www.gutenberg.org/policy/terms_of_use.html) |
| DOGC | dogc_ca_20230901 | Data Sharing Agreement\* |
| DOGV | dogv_ca_20231006 | Data Sharing Agreement\* |
| BOUA | boua_ca_20231006 | Data Sharing Agreement\* |
| Aquí Berguedà | aquibergueda_ca_20231009 | Data Sharing Agreement\* |
| Parlament de Catalunya | parlament_ca_20232009 | Data Sharing Agreement\* |
| CaWac | cawac_ca_20200528 | [CC-BY-SA-3.0](https://creativecommons.org/licenses/by-sa/3.0/legalcode) |
| MaCoCu | macocu_ca_20230731 | [CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode) |
| Crawling populars | crawling-populars_ca_20200525 | [CC0-1.0](https://creativecommons.org/publicdomain/zero/1.0/legalcode) |
| Colossal OSCAR 1 (03-04-23) | colossal-oscar-03-04-23_ca_20230829 | [CC0-1.0](https://creativecommons.org/publicdomain/zero/1.0/legalcode) |
| Colossal OSCAR 1 (05-06-23) | colossal-oscar-05-06-23_ca_20230829 | [CC0-1.0](https://creativecommons.org/publicdomain/zero/1.0/legalcode) |
| Colossal OSCAR 1 (2022-27) | colossal-oscar-2022-27_ca_20231005 | [CC0-1.0](https://creativecommons.org/publicdomain/zero/1.0/legalcode) |
| OSCAR-2201 | oscar-2201_ca_20230904 | [CC0-1.0](https://creativecommons.org/publicdomain/zero/1.0/legalcode) |
| OSCAR-2301 | oscar-2301_ca_20230418 | [CC0-1.0](https://creativecommons.org/publicdomain/zero/1.0/legalcode) |
| mC4 | mc4_ca_20230418 | [CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode) |
\* The data from each entity is governed by a distinct Data Sharing Agreement. All data provided by these entities is open and freely distributable.
### Citation Information
[N/A]
### Contributions
We thank the VIVES Plan for language technologies of the Valencian community, https://vives.gplsi.es/, from the CENID Digital Intelligence Center of the University of Alicante and the [DFKI](https://www.dfki.de/web) for their collaboration and contribution.
This work was funded by Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya within the framework of Projecte AINA.
|
iamnguyen/ds_by_sys_prompt_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 55803207.524942234
num_examples: 32718
download_size: 24839940
dataset_size: 55803207.524942234
---
# Dataset Card for "ds_by_sys_prompt_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ReligiousLLMs/quran_ayats_with_shakl | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1312907
num_examples: 6236
download_size: 580359
dataset_size: 1312907
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
derexHf/MathInstruct2000 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 120877641
num_examples: 161289
download_size: 60482176
dataset_size: 120877641
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
halilbabacan/autotrain-data-cognitive_distortion_gpt_roberta | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: cognitive_distortion_gpt_roberta
## Dataset Description
This dataset has been automatically processed by AutoTrain for project cognitive_distortion_gpt_roberta.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "From an 80 year old woman in the US: Life will be going normalSomething will happen that I don\u2019t like or disagree with nothing serious All of a sudden I feel pressure in my head maybe like it will burstI never know when it might happen and there is no warning I guess it is an anxiety attack",
"target": 1
},
{
"text": " My friend scored higher than me in the test I must be less intelligent.",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['Distortion', 'No Distortion'], id=None)"
}
```
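Note that in the raw splits `target` is stored as an integer index into the `ClassLabel` names above. A minimal decoding sketch:

```python
# Decode integer targets back to their class names, mirroring the ClassLabel above.
names = ["Distortion", "No Distortion"]

def decode(target: int) -> str:
    return names[target]

assert decode(0) == "Distortion"
assert decode(1) == "No Distortion"
```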
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1543 |
| valid | 387 |
|
sunhaozhepy/sst_rake_keywords_embeddings | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: float32
- name: tokens
dtype: string
- name: tree
dtype: string
- name: keywords
dtype: string
- name: keywords_embeddings
sequence: float32
splits:
- name: train
num_bytes: 29520504
num_examples: 8544
- name: validation
num_bytes: 3807534
num_examples: 1101
- name: test
num_bytes: 7636757
num_examples: 2210
download_size: 47216489
dataset_size: 40964795
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
316usman/thematic3bembed | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 98024603
num_examples: 127686
download_size: 27749456
dataset_size: 98024603
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hippocrates/Casereport_gpt_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 84770635
num_examples: 18068
- name: valid
num_bytes: 84770635
num_examples: 18068
- name: test
num_bytes: 84770635
num_examples: 18068
download_size: 134231586
dataset_size: 254311905
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
lipaoMai/drug_one_1dataset | ---
dataset_info:
features:
- name: patient_id
dtype: int64
- name: drugName
dtype: string
- name: condition
dtype: string
- name: review
dtype: string
- name: rating
dtype: float64
- name: date
dtype: string
- name: usefulCount
dtype: int64
splits:
- name: test
num_bytes: 28367208
num_examples: 53471
- name: train
num_bytes: 85172055
num_examples: 160398
download_size: 63481104
dataset_size: 113539263
---
# Dataset Card for "drug_one_1dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-imdb-plain_text-fdc5b9-67091145591 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- imdb
eval_info:
task: summarization
model: t5-small
metrics: []
dataset_name: imdb
dataset_config: plain_text
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: t5-small
* Dataset: imdb
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@michaeldesmond](https://huggingface.co/michaeldesmond) for evaluating this model. |
bigbio/spl_adr_200db |
---
language:
- en
bigbio_language:
- English
license: cc0-1.0
multilinguality: monolingual
bigbio_license_shortname: CC0_1p0
pretty_name: SPL ADR
homepage: https://bionlp.nlm.nih.gov/tac2017adversereactions/
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- NAMED_ENTITY_DISAMBIGUATION
- RELATION_EXTRACTION
---
# Dataset Card for SPL ADR
## Dataset Description
- **Homepage:** https://bionlp.nlm.nih.gov/tac2017adversereactions/
- **Pubmed:** False
- **Public:** True
- **Tasks:** NER, NED, RE
The United States Food and Drug Administration (FDA) partnered with the National Library
of Medicine to create a pilot dataset containing standardised information about known
adverse reactions for 200 FDA-approved drugs. The Structured Product Labels (SPLs),
the documents FDA uses to exchange information about drugs and other products, were
manually annotated for adverse reactions at the mention level to facilitate development
and evaluation of text mining tools for extraction of ADRs from all SPLs. The ADRs were
then normalised to the Unified Medical Language System (UMLS) and to the Medical
Dictionary for Regulatory Activities (MedDRA).
## Citation Information
```
@article{demner2018dataset,
author = {Demner-Fushman, Dina and Shooshan, Sonya and Rodriguez, Laritza and Aronson,
Alan and Lang, Francois and Rogers, Willie and Roberts, Kirk and Tonning, Joseph},
title = {A dataset of 200 structured product labels annotated for adverse drug reactions},
journal = {Scientific Data},
volume = {5},
year = {2018},
month = {01},
pages = {180001},
url = {
https://www.researchgate.net/publication/322810855_A_dataset_of_200_structured_product_labels_annotated_for_adverse_drug_reactions
},
doi = {10.1038/sdata.2018.1}
}
```
|
Johnnyeee/Yelpdata_663 | ---
configs:
- config_name: yelpreview
data_files:
- split: train
path: yelptrain_data.parquet
- split: test
path: yelptest_data.parquet
task_categories:
- summarization
language:
- en
size_categories:
- 100B<n<1T
tags:
- yelp review
- restaurant review
---
# Dataset Card for Yelp Restaurant Dataset
## Dataset Description
### Dataset Access
- [Yelp Raw Data Download Link](https://www.yelp.com/dataset/download)
### Raw Dataset Summary
Yelp raw data encompasses a wealth of information from the Yelp platform, detailing user reviews, business ratings, and operational specifics across a diverse array of local establishments.
To be more specific, the Yelp raw dataset contains **five different JSON files**:
- `yelp_academic_dataset_business.json` (118.9MB)
This file contains information about businesses listed on Yelp. Each record in this dataset typically includes the business's name, address, city, state, postal code, latitude and longitude, stars (average rating), review count, categories (e.g., Restaurants, Shopping, etc.), and other attributes like parking availability or if it's wheelchair accessible.
- `yelp_academic_dataset_checkin.json` (287MB)
The checkin file provides data on check-ins at businesses by users over time. It includes the business ID and a series of timestamps indicating when users checked in at that location, offering insights into the popularity of the business at different times and days.
- `yelp_academic_dataset_review.json` (5.34GB)
This dataset contains reviews written by users for businesses. Each review includes the user ID, business ID, stars given (1 to 5), useful/funny/cool votes, the text of the review, and the date it was posted. This data can be used to analyze customer sentiment, evaluate service quality, and more.
- `yelp_academic_dataset_tip.json` (180.6MB)
Tips are short messages left by users about a business, often containing suggestions, compliments, or advice for future customers. This file includes information such as the text of the tip, the date it was left, the business ID, and the user ID. Tips provide quick, insightful feedback about a business.
- `yelp_academic_dataset_user.json` (3.36GB)
This file contains data about Yelp users, including their user ID, name, review count, yelping since (the date they joined Yelp), friends (a list of user IDs representing their friends on Yelp), useful/funny/cool vote counts they've received, fans (the number of users who've marked them as a "fan"), and their average stars given. It can be used to analyze user behavior, social networks, and influence on Yelp.
### Language
The Yelp dataset is primarily composed of English language text for its reviews, business information, and user interactions.
## Dataset Process
In this project, we use only
[`yelp_academic_dataset_business.json`](https://yelpdata.s3.us-west-2.amazonaws.com/yelp_academic_dataset_business.json)
and [`yelp_academic_dataset_review.json`](https://yelpdata.s3.us-west-2.amazonaws.com/yelp_academic_dataset_review.json). (You can inspect the JSON files by clicking the links.)
Since we focus solely on restaurants, we follow these steps to build the target datasets:
- Load `yelp_academic_dataset_business.json` and `yelp_academic_dataset_review.json` as pandas DataFrames.
- Perform an inner merge of these datasets on `business_id` and filter out businesses that are not restaurants (drop rows whose `categories` field does not contain "restaurants").
- Split the resulting restaurant dataset into training and testing sets by shuffling it and then splitting 80/20.
- Finally, we obtain the Yelp restaurant training and testing datasets.
(In addition to the data processing in the loading script, a standalone data-processing notebook is provided: [Data Process Colab Link](https://colab.research.google.com/drive/1r_gUGmsawwtFpZCj23X1jWjfEi6Dw291?usp=sharing))
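The merge, filter, and split steps can be sketched with pandas. The toy DataFrames below stand in for the real JSON files (which would be loaded with `pd.read_json(..., lines=True)`), and `random_state=42` is an illustrative choice, not the value used to build the published splits:

```python
import pandas as pd

# Tiny stand-in frames for the business and review JSON files.
business = pd.DataFrame({
    "business_id": ["b1", "b2", "b3"],
    "name": ["Cafe A", "Gym B", "Diner C"],
    "categories": ["Restaurants, Coffee & Tea", "Fitness", "Restaurants, Diners"],
})
reviews = pd.DataFrame({
    "review_id": ["r1", "r2", "r3", "r4"],
    "business_id": ["b1", "b2", "b3", "b3"],
    "stars": [4.0, 5.0, 3.0, 2.0],
})

# Inner merge on business_id, then keep only restaurant rows.
merged = business.merge(reviews, on="business_id", how="inner")
restaurants = merged[merged["categories"].str.contains("Restaurants", na=False)]

# Shuffle, then split 80/20 into train and test.
shuffled = restaurants.sample(frac=1, random_state=42).reset_index(drop=True)
cut = int(len(shuffled) * 0.8)
train_df, test_df = shuffled.iloc[:cut], shuffled.iloc[cut:]
```

On this toy data, the gym (`b2`) is dropped and the three restaurant reviews are split 2/1.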
## Restaurant Dataset
### Restaurant Dataset Summary
- `yelptrain_data.parquet`
This dataset provides a detailed overview of businesses, focusing on aspects such as location, ratings, and customer reviews. It contains columns that identify each business, its geographical information, and metrics indicating its performance, such as aggregate ratings and review counts. Additionally, it includes specifics about the types of services and cuisines offered, operational hours, and detailed customer reviews with ratings, usefulness, humor, and coolness indicators, as well as the text content of the reviews and their posting dates. This dataset includes 3,778,658 rows and it is 2.26 GB.
- `yelptest_data.parquet`
This dataset provides the same information as `yelptrain_data.parquet`, but it includes 943,408 rows and it is 591 MB.
### Supported Tasks
- Sentiment Analysis: By examining the textual reviews, natural language processing can be used to gauge customer sentiment towards businesses, categorizing opinions into positive, negative, or neutral sentiments.
- Rating Prediction: Machine learning models can leverage user and business attributes to predict the potential ratings a business might receive, helping in understanding factors that influence customer satisfaction.
- Business Analytics: Analysis of business performance metrics such as average ratings, review counts, and operational status can inform business owners about their market standing and customer perceptions.
- Recommendation Systems: The data can feed into recommendation algorithms to suggest businesses to users based on their preferences, previous ratings, and similar user behavior.
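As a minimal illustration of the sentiment-analysis task, the `stars_y` review rating can be bucketed into coarse sentiment labels. The thresholds below are an assumption for illustration, not part of the dataset:

```python
def stars_to_sentiment(stars: float) -> str:
    """Map a 1-5 star review rating to a coarse sentiment label.
    The <=2 / 3 / >=4 thresholds are illustrative choices."""
    if stars <= 2.0:
        return "negative"
    if stars < 4.0:
        return "neutral"
    return "positive"

labels = [stars_to_sentiment(s) for s in (1.0, 3.0, 5.0)]
```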
### Restaurant Dataset Structure
#### Variables
- business_id: A unique identifier for each business listed in the dataset. (non-null, object)
- name: The name of the business. (non-null, object)
- address: The street address of the business. (non-null, object)
- city: The city where the business is located. (non-null, object)
- state: The state or region where the business is located. (non-null, object)
- postal_code: The postal code associated with the business location. (non-null, object)
- latitude: The latitude coordinate of the business for geographical mapping. (non-null, float64)
- longitude: The longitude coordinate of the business for geographical mapping. (non-null, float64)
- stars_x: The average star rating of the business. (non-null, float64)
- review_count: The number of reviews the business has received. (non-null, int64)
- is_open: A binary variable indicating whether the business is open (1) or closed (0). (non-null, int64)
- attributes: A collection of attributes about the business, like 'Accepts Credit Cards', 'Parking', 'Wi-Fi', etc. (493 missing values out of 200,000 rows, object)
- categories: The categories the business falls under, such as 'Restaurants', 'Food', 'Coffee', etc. (non-null, object)
- hours: The hours of operation for the business. (6,905 missing values out of 200,000 rows, object)
- review_id: A unique identifier for each review. (non-null, object)
- user_id: A unique identifier for each user who has left a review. (non-null, object)
- stars_y: The star rating given by the user in their review. (non-null, float64)
- useful: The number of users who found the review useful. (non-null, int64)
- funny: The number of users who found the review funny. (non-null, int64)
- cool: The number of users who found the review cool. (non-null, int64)
- text: The text content of the review. (non-null, object)
- date: The date when the review was posted. (non-null, object)
#### Example Instance
```
{'business_id': 'XQfwVwDr-v0ZS3_CbbE5Xw',
'name': 'Turning Point of North Wales',
'address': '1460 Bethlehem Pike',
'city': 'North Wales',
'state': 'PA',
'postal_code': '19454',
'latitude': 40.21019744873047,
'longitude': -75.22364044189453,
'stars_x': 3.0,
'review_count': 169.0,
'is_open': 1.0,
'categories': 'Restaurants, Breakfast & Brunch, Food, Juice Bars & Smoothies, American (New), Coffee & Tea, Sandwiches',
'hours': '{"Monday": "7:30-15:0", "Tuesday": "7:30-15:0", "Wednesday": "7:30-15:0", "Thursday": "7:30-15:0", "Friday": "7:30-15:0", "Saturday": "7:30-15:0", "Sunday": "7:30-15:0"}',
'review_id': 'KU_O5udG6zpxOg-VcAEodg',
'user_id': 'mh_-eMZ6K5RLWhZyISBhwA',
'stars_y': 3.0,
'useful': 0.0,
'funny': 0.0,
'cool': 0.0,
'text': "If you decide to eat here, just be aware it is going to take about 2 hours from beginning to end. We have tried it multiple times, because I want to like it! I have been to it's other locations in NJ and never had a bad experience. \n\nThe food is good, but it takes a very long time to come out. The waitstaff is very young, but usually pleasant. We have just had too many experiences where we spent way too long waiting. We usually opt for another diner or restaurant on the weekends, in order to be done quicker.",
'date': '2018-07-07 22:09:11',
'attributes': '{"NoiseLevel": "u\'average\'", "HasTV": "False", "RestaurantsAttire": "\'casual\'", "BikeParking": "False", "Ambience": "{\'touristy\': False, \'hipster\': False, \'romantic\': False, \'divey\': False, \'intimate\': False, \'trendy\': False, \'upscale\': False, \'classy\': False, \'casual\': True}", "WiFi": "\'free\'", "DogsAllowed": "False", "Alcohol": "\'none\'", "BusinessAcceptsCreditCards": "True", "RestaurantsGoodForGroups": "True", "RestaurantsPriceRange2": "2", "RestaurantsReservations": "False", "WheelchairAccessible": "True", "BusinessAcceptsBitcoin": "False", "RestaurantsTableService": "True", "GoodForKids": "True", "Caters": "False", "HappyHour": "False", "RestaurantsDelivery": "True", "GoodForMeal": "{\'dessert\': False, \'latenight\': False, \'lunch\': True, \'dinner\': False, \'brunch\': True, \'breakfast\': True}", "OutdoorSeating": "True", "RestaurantsTakeOut": "True", "BusinessParking": "{\'garage\': False, \'street\': False, \'validated\': False, \'lot\': True, \'valet\': False}"}'}
```
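Note that the `hours` and `attributes` fields in the instance above are stored as strings. A sketch of how they might be decoded: the outer layer parses as JSON, while the `attributes` values look like Python literals (`"False"`, `"u'average'"`, `"{'lot': True}"`), so `ast.literal_eval` is used here as an assumption about their format:

```python
import json
import ast

# Example field values copied (abridged) from the record shown above.
hours_raw = '{"Monday": "7:30-15:0", "Tuesday": "7:30-15:0"}'
attrs_raw = ('{"NoiseLevel": "u\'average\'", "HasTV": "False", '
             '"BusinessParking": "{\'lot\': True, \'valet\': False}"}')

# hours is plain JSON.
hours = json.loads(hours_raw)

def parse_attr(value):
    """Decode one attribute value; they appear to be Python literals
    serialized as strings, e.g. "False", "u'average'", "{'lot': True}"."""
    try:
        return ast.literal_eval(value)
    except (ValueError, SyntaxError):
        return value  # leave unparseable values as raw strings

attributes = {k: parse_attr(v) for k, v in json.loads(attrs_raw).items()}
```

After decoding, `attributes["HasTV"]` is the boolean `False` rather than the string `"False"`, which makes downstream filtering much less error-prone.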
### Usage
The dataset is compatible with the Hugging Face `datasets` library. The dataset class `YelpDataset` provides methods to access the structured data efficiently, including features detailing business information, user reviews, and user profiles.
### Getting Started
To start working with the Yelp Dataset in Python, ensure you have the Hugging Face `datasets` library installed. Then, you can load the dataset using the `YelpDataset` class provided in the script. Here's a quick example:
```
from datasets import load_dataset
dataset = load_dataset("Johnnyeee/Yelpdata_663", trust_remote_code=True)
```
This will give you a quick glimpse into the structure and content of the dataset, ready for your analysis or model training tasks.
You can also retrieve a training example with:
```
next(iter(dataset['train']))
```
and a testing example with:
```
next(iter(dataset['test']))
```
You can check this Colab notebook for more details: [Dataset Check Colab Link](https://colab.research.google.com/drive/1ybXGIYUqJ7DH22A4apynfrWCMGzb2v_T?usp=sharing)
## Dataset Creation
### Curation Rationale
The dataset includes a variety of data types (e.g., business information, reviews, user data, check-ins, and tips), enabling a wide range of research topics and studies in areas such as natural language processing, social network analysis, recommender systems, and geographic information systems.
By providing data from an active and popular platform, the dataset offers insights into real-world consumer behavior, business trends, and social interactions. This relevance makes it an excellent resource for studies aiming to understand or model aspects of the contemporary economy and society.
By making the dataset publicly available for academic and educational purposes, Yelp aims to contribute to the broader academic community. It lowers barriers for researchers and educators who might not have access to large-scale, real-world data.
## Considerations
### Bias
- Geographic Bias: Yelp's presence and popularity vary significantly across different regions. If the dataset has more extensive coverage in certain areas, the analysis might not accurately reflect regions with lower Yelp usage, leading to skewed insights about restaurant preferences or trends.
- User Demographic Bias: Yelp users may not be a representative sample of the broader population. Factors such as age, income, and tech-savviness can influence who uses Yelp and who writes reviews. This skew can affect the perceived quality or popularity of restaurants.
- Selection Bias: By focusing solely on restaurants and the first 200,000 rows of the merged dataset, there's a risk of omitting relevant data that could offer a more comprehensive understanding of consumer preferences or business performance. The initial selection process might also favor certain types of restaurants or those with more reviews, skewing the analysis.
- Rating Bias: Users who leave reviews might be more likely to do so after exceptionally positive or negative experiences, which doesn't always accurately reflect the average customer experience. This can lead to a polarization of ratings, where the data might not accurately represent the overall quality of service.
### Limitations
- Data Completeness: The dataset might not capture all restaurants or reviews, especially newer businesses or those that have not been reviewed on Yelp. This incompleteness can limit the analysis's scope and the accuracy of findings.
- Temporal Dynamics: Consumer preferences and restaurant quality can change over time. The dataset represents a snapshot, and without considering the time aspect, it might not accurately reflect current trends or the impact of external events (e.g., a pandemic).
- Memory Constraints: Limiting the analysis to the first 200,000 rows to manage memory usage could introduce sample bias, as this approach does not guarantee a random or representative sample of the entire dataset. This constraint might overlook valuable insights from the excluded data.
- Lack of External Data: By not incorporating external data sources, such as economic indicators, health inspection scores, or social media sentiment, the analysis might miss out on important factors that could influence restaurant performance or consumer preferences.
- Data Privacy and Ethics: While the dataset is curated for academic use, there's always a concern regarding user privacy and the ethical use of data, particularly in how user-generated content is analyzed and interpreted.
### Dataset Terms of Use
Yelp's dataset comes with a detailed set of terms of use, which you can review by visiting their Dataset User Agreement. The agreement can be found at the provided link: [Yelp Dataset User Agreement](https://s3-media0.fl.yelpcdn.com/assets/srv0/engineering_pages/f64cb2d3efcc/assets/vendor/Dataset_User_Agreement.pdf). This document will contain specific guidelines and restrictions that are crucial for anyone working with Yelp's dataset.
# Links
All relative links:
- Yelp raw dataset: https://www.yelp.com/dataset/download
- yelp_academic_dataset_business.json: https://yelpdata.s3.us-west-2.amazonaws.com/yelp_academic_dataset_business.json
- yelp_academic_dataset_review.json: https://yelpdata.s3.us-west-2.amazonaws.com/yelp_academic_dataset_review.json
- Data Processing: https://colab.research.google.com/drive/1r_gUGmsawwtFpZCj23X1jWjfEi6Dw291?usp=sharing
- Dataset Check: https://colab.research.google.com/drive/1ybXGIYUqJ7DH22A4apynfrWCMGzb2v_T?usp=sharing |
NekoJojo/modified_wider_face_val | ---
dataset_info:
features:
- name: image
dtype: image
- name: labels
sequence: int64
- name: bbox
sequence:
sequence: float64
- name: valid_length
dtype: int64
- name: original_size
sequence: int64
- name: resized_bbox
sequence:
sequence: float64
splits:
- name: validation
num_bytes: 381040281.125
num_examples: 3167
download_size: 355764093
dataset_size: 381040281.125
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
chuyin0321/extended-trading-stocks | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: string
- name: time
dtype: string
- name: price
dtype: float64
- name: share_volume
dtype: string
splits:
- name: train
num_bytes: 4680296
num_examples: 98899
download_size: 824886
dataset_size: 4680296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "extended-trading-stocks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sxiong/Power-equipment-image-dataset | ---
license: mit
---
## Power-equipment-image-dataset
This repository contains the power equipment image data. If you find it useful, please cite our paper.
## Citation
```
@article{xiong2021object,
title={Object recognition for power equipment via human-level concept learning},
author={Xiong, Siheng and Liu, Yadong and Yan, Yingjie and Pei, Ling and Xu, Peng and Fu, Xiaofei and Jiang, Xiuchen},
journal={IET Generation, Transmission \& Distribution},
volume={15},
number={10},
pages={1578--1587},
year={2021},
publisher={Wiley Online Library}
}
```
|
pccl-org/formal-logic-simple-order-new-objects-paired-bigger-5000 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: paired_example
sequence:
sequence: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 3166998759
num_examples: 12492503
download_size: 1120426911
dataset_size: 3166998759
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "formal-logic-simple-order-new-objects-paired-bigger-5000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mrdanizm/Mestablediffusion | ---
license: other
---
|