datasetId | card |
|---|---|
316usman/thematic2a_rr | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 53695426.810901806
num_examples: 84124
download_size: 18478592
dataset_size: 53695426.810901806
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
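A minimal, illustrative loading sketch for the schema above (not part of the original card), assuming the repository is publicly accessible and using the default config and `train` split declared in the metadata:

```python
from datasets import load_dataset

# Load the single "train" split of the default config declared above.
ds = load_dataset("316usman/thematic2a_rr", split="train")

# Each record exposes the four declared columns.
example = ds[0]
print(example["document_url"], example["num_tokens"])
print(example["text"][:200])
```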
|
Fermat111/FOL | ---
license: apache-2.0
---
|
mikeion/dissertation_data_with_split | ---
dataset_info:
features:
- name: conversation_id
dtype: int64
- name: help_channel
dtype: string
- name: __rowid__
dtype: string
- name: author_id
dtype: int64
- name: author_name
dtype: string
- name: timestamp
dtype: string
- name: content
dtype: string
- name: reference.messageId
dtype: string
- name: reference.channelId
dtype: string
- name: reference.guildId
dtype: string
- name: url
dtype: string
- name: fileName
dtype: string
- name: student
dtype: int64
- name: helper
dtype: int64
- name: references.id
dtype: float64
- name: references.name
dtype: string
- name: references.discriminator
dtype: string
- name: references.nickname
dtype: string
- name: references.isBot
dtype: bool
splits:
- name: train
num_bytes: 1137105139
num_examples: 5610163
download_size: 267837450
dataset_size: 1137105139
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
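As an illustrative sketch (an assumption-based example, not part of the original card), the flattened Discord-style columns above can be read directly by their dotted names:

```python
from datasets import load_dataset

ds = load_dataset("mikeion/dissertation_data_with_split", split="train")

msg = ds[0]
# Flattened reference fields keep their dotted column names, e.g. "reference.messageId".
print(msg["author_name"], msg["timestamp"])
print(msg["content"][:200])
print("replies to:", msg["reference.messageId"])
```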
|
nthngdy/culturax_fr_metrics | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: url
dtype: string
- name: source
dtype: string
- name: oscar_ppl
dtype: float64
- name: wiki_ppl
dtype: float64
- name: char_length
dtype: int64
splits:
- name: train
num_bytes: 368624753
num_examples: 100000
download_size: 224697431
dataset_size: 368624753
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
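A hedged sketch (not part of the original card) of how the perplexity and length columns declared above could be used to filter the split after loading; the thresholds are arbitrary examples:

```python
from datasets import load_dataset

ds = load_dataset("nthngdy/culturax_fr_metrics", split="train")

# Keep documents below an (arbitrary, illustrative) Wikipedia-perplexity
# threshold and above a minimum character length.
filtered = ds.filter(lambda ex: ex["wiki_ppl"] < 500.0 and ex["char_length"] > 200)
print(len(filtered), "of", len(ds), "documents kept")
```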
|
Lucianopacheco/rayner01 | ---
license: apache-2.0
---
|
tyzhu/squad_qa_num_v5_full_recite_ans_sent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7745401
num_examples: 5070
- name: validation
num_bytes: 403389
num_examples: 300
download_size: 0
dataset_size: 8148790
---
# Dataset Card for "squad_qa_num_v5_full_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_remove_det_definite | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 119889
num_examples: 699
- name: test
num_bytes: 75537
num_examples: 470
- name: train
num_bytes: 324357
num_examples: 1851
download_size: 329841
dataset_size: 519783
---
# Dataset Card for "MULTI_VALUE_stsb_remove_det_definite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indonlp/nusaparagraph_emot | ---
license: apache-2.0
---
|
AndyLiu0104/Soldering-Data-Tiny-appearance_hole-0731 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 10581.0
num_examples: 6
download_size: 11668
dataset_size: 10581.0
---
# Dataset Card for "Soldering-Data-Tiny-appearance_hole-0731"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/AA_RoBERTa_Finetuned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318780.21618997
num_examples: 26057
- name: test
num_bytes: 26774087.073587257
num_examples: 8686
download_size: 147169115
dataset_size: 107092867.28977722
---
# Dataset Card for "AA_RoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_psychology-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7999
num_examples: 5
- name: test
num_bytes: 2096464
num_examples: 612
download_size: 14733
dataset_size: 2104463
---
# Dataset Card for "mmlu-professional_psychology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fantasticrambo/covid-tweet-sentiment-analyzer-distilbert-data | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 10366704
num_examples: 7999
- name: val
num_bytes: 2592000
num_examples: 2000
download_size: 514530
dataset_size: 12958704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
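Since the columns above are already tokenizer outputs (`input_ids`, `attention_mask`, `labels`), a hedged sketch for exposing them as PyTorch tensors (assuming `torch` is installed) is:

```python
from datasets import load_dataset

ds = load_dataset("fantasticrambo/covid-tweet-sentiment-analyzer-distilbert-data")

# Expose the pre-tokenized columns as PyTorch tensors.
train = ds["train"].with_format("torch", columns=["input_ids", "attention_mask", "labels"])

row = train[0]
print(row["input_ids"].shape, row["attention_mask"].shape, row["labels"])
```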
|
open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B | ---
pretty_name: Evaluation run of edor/Hermes-Platypus2-mini-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [edor/Hermes-Platypus2-mini-7B](https://huggingface.co/edor/Hermes-Platypus2-mini-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-16T10:47:02.037059](https://huggingface.co/datasets/open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B/blob/main/results_2023-08-16T10%3A47%3A02.037059.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4739285188775824,\n\
\ \"acc_stderr\": 0.035185125877572575,\n \"acc_norm\": 0.4774082437104984,\n\
\ \"acc_norm_stderr\": 0.035170487487277746,\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49276058409873585,\n\
\ \"mc2_stderr\": 0.01516224977207343\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.014595873205358269,\n\
\ \"acc_norm\": 0.537542662116041,\n \"acc_norm_stderr\": 0.014570144495075581\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6015733917546305,\n\
\ \"acc_stderr\": 0.004885735963346904,\n \"acc_norm\": 0.7923720374427405,\n\
\ \"acc_norm_stderr\": 0.0040477996462346365\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851316,\n\
\ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851316\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5707070707070707,\n \"acc_stderr\": 0.035265527246012,\n \"acc_norm\"\
: 0.5707070707070707,\n \"acc_norm_stderr\": 0.035265527246012\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4307692307692308,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6440366972477064,\n \"acc_stderr\": 0.020528559278244214,\n \"\
acc_norm\": 0.6440366972477064,\n \"acc_norm_stderr\": 0.020528559278244214\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674119,\n \"\
acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674119\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610805,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610805\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809444,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809444\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n\
\ \"acc_stderr\": 0.016967031766413624,\n \"acc_norm\": 0.6577266922094508,\n\
\ \"acc_norm_stderr\": 0.016967031766413624\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258875,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468636,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468636\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\
\ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n\
\ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3539765319426336,\n\
\ \"acc_stderr\": 0.012213504731731637,\n \"acc_norm\": 0.3539765319426336,\n\
\ \"acc_norm_stderr\": 0.012213504731731637\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44607843137254904,\n \"acc_stderr\": 0.02010986454718136,\n \
\ \"acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.02010986454718136\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.031751952375833226,\n\
\ \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.031751952375833226\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.0368713061556206,\n\
\ \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.0368713061556206\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49276058409873585,\n\
\ \"mc2_stderr\": 0.01516224977207343\n }\n}\n```"
repo_url: https://huggingface.co/edor/Hermes-Platypus2-mini-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|arc:challenge|25_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hellaswag|10_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T10:47:02.037059.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T10:47:02.037059.parquet'
- config_name: results
data_files:
- split: 2023_08_16T10_47_02.037059
path:
- results_2023-08-16T10:47:02.037059.parquet
- split: latest
path:
- results_2023-08-16T10:47:02.037059.parquet
---
# Dataset Card for Evaluation run of edor/Hermes-Platypus2-mini-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/edor/Hermes-Platypus2-mini-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [edor/Hermes-Platypus2-mini-7B](https://huggingface.co/edor/Hermes-Platypus2-mini-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B",
"harness_truthfulqa_mc_0",
split="train")
```
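The same call works for any other configuration or split declared in the YAML header above. As a minimal sketch (using config and split names taken directly from that list), you can load the aggregated metrics of the run or pin one MMLU subject to the timestamped split:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B"

# Aggregated metrics of the run (the "results" configuration above).
results = load_dataset(repo, "results", split="latest")

# Per-sample details for one MMLU subject, pinned to the timestamped split.
abstract_algebra = load_dataset(
    repo,
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_08_16T10_47_02.037059",
)
```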
## Latest results
These are the [latest results from run 2023-08-16T10:47:02.037059](https://huggingface.co/datasets/open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B/blob/main/results_2023-08-16T10%3A47%3A02.037059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4739285188775824,
"acc_stderr": 0.035185125877572575,
"acc_norm": 0.4774082437104984,
"acc_norm_stderr": 0.035170487487277746,
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012055,
"mc2": 0.49276058409873585,
"mc2_stderr": 0.01516224977207343
},
"harness|arc:challenge|25": {
"acc": 0.523037542662116,
"acc_stderr": 0.014595873205358269,
"acc_norm": 0.537542662116041,
"acc_norm_stderr": 0.014570144495075581
},
"harness|hellaswag|10": {
"acc": 0.6015733917546305,
"acc_stderr": 0.004885735963346904,
"acc_norm": 0.7923720374427405,
"acc_norm_stderr": 0.0040477996462346365
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851316,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851316
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5707070707070707,
"acc_stderr": 0.035265527246012,
"acc_norm": 0.5707070707070707,
"acc_norm_stderr": 0.035265527246012
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4307692307692308,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.4307692307692308,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6440366972477064,
"acc_stderr": 0.020528559278244214,
"acc_norm": 0.6440366972477064,
"acc_norm_stderr": 0.020528559278244214
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674119,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674119
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610805,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809444,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809444
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.016967031766413624,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.016967031766413624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.026854257928258875,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.026854257928258875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468636,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468636
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3539765319426336,
"acc_stderr": 0.012213504731731637,
"acc_norm": 0.3539765319426336,
"acc_norm_stderr": 0.012213504731731637
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.02010986454718136,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.02010986454718136
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.563265306122449,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.563265306122449,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.0368713061556206,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.0368713061556206
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012055,
"mc2": 0.49276058409873585,
"mc2_stderr": 0.01516224977207343
}
}
```
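If you copy the JSON above into a Python dict, you can slice it however you need. The sketch below (assuming the dict is named `results`, as a stand-in for the JSON shown above) averages the accuracy over the MMLU (hendrycksTest) subtasks:
```python
# `results` is assumed to hold the JSON dict printed above.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU subtasks: {len(mmlu)}, average acc: {mmlu_avg_acc:.4f}")
```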
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
anan-2024/twitter_dataset_1713147740 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 141931
num_examples: 390
download_size: 80099
dataset_size: 141931
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat | ---
pretty_name: Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kyujinpy/PlatYi-34B-200k-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T08:30:20.014698](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat/blob/main/results_2023-12-10T08-30-20.014698.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7630727247628006,\n\
\ \"acc_stderr\": 0.028221206890446823,\n \"acc_norm\": 0.770488792020382,\n\
\ \"acc_norm_stderr\": 0.028732290582792492,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4838395775572536,\n\
\ \"mc2_stderr\": 0.014874467350764172\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910471,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726097\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6467835092611034,\n\
\ \"acc_stderr\": 0.004769924131304649,\n \"acc_norm\": 0.8445528779127663,\n\
\ \"acc_norm_stderr\": 0.003615898928269288\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.03885004245800253,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.03885004245800253\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.02407999513006225,\n\
\ \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.02407999513006225\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n\
\ \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n\
\ \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496228,\n\
\ \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496228\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7380952380952381,\n \"acc_stderr\": 0.022644212615525218,\n \"\
acc_norm\": 0.7380952380952381,\n \"acc_norm_stderr\": 0.022644212615525218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n\
\ \"acc_stderr\": 0.017308381281034527,\n \"acc_norm\": 0.896774193548387,\n\
\ \"acc_norm_stderr\": 0.017308381281034527\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\"\
: 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865397,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865397\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9444444444444444,\n \"acc_stderr\": 0.0163199507007674,\n \"acc_norm\"\
: 0.9444444444444444,\n \"acc_norm_stderr\": 0.0163199507007674\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295127,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295127\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.0196716324131003,\n \
\ \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.0196716324131003\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654,\n \
\ \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.02244826447683258,\n \
\ \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.02244826447683258\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"\
acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9302752293577982,\n \"acc_stderr\": 0.010919426411848607,\n \"\
acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.010919426411848607\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6759259259259259,\n \"acc_stderr\": 0.03191923445686186,\n \"\
acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.03191923445686186\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035216,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035216\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.032472243899179465,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.032472243899179465\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n\
\ \"acc_stderr\": 0.045479609997643757,\n \"acc_norm\": 0.6428571428571429,\n\
\ \"acc_norm_stderr\": 0.045479609997643757\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n\
\ \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253867,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253867\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9144316730523627,\n\
\ \"acc_stderr\": 0.010002965568647286,\n \"acc_norm\": 0.9144316730523627,\n\
\ \"acc_norm_stderr\": 0.010002965568647286\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.020903975842083027,\n\
\ \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.020903975842083027\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7262569832402235,\n\
\ \"acc_stderr\": 0.014912413096372432,\n \"acc_norm\": 0.7262569832402235,\n\
\ \"acc_norm_stderr\": 0.014912413096372432\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.01970403918385981,\n\
\ \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.01970403918385981\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n\
\ \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n\
\ \"acc_norm_stderr\": 0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790906,\n\
\ \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790906\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614095,\n \
\ \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6173402868318123,\n\
\ \"acc_stderr\": 0.01241359588289327,\n \"acc_norm\": 0.6173402868318123,\n\
\ \"acc_norm_stderr\": 0.01241359588289327\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.7636363636363637,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650163,\n\
\ \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4838395775572536,\n\
\ \"mc2_stderr\": 0.014874467350764172\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.01108253884749189\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.514783927217589,\n \
\ \"acc_stderr\": 0.0137664630507876\n }\n}\n```"
repo_url: https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|arc:challenge|25_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|gsm8k|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hellaswag|10_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T08-30-20.014698.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- '**/details_harness|winogrande|5_2023-12-10T08-30-20.014698.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T08-30-20.014698.parquet'
- config_name: results
data_files:
- split: 2023_12_10T08_30_20.014698
path:
- results_2023-12-10T08-30-20.014698.parquet
- split: latest
path:
- results_2023-12-10T08-30-20.014698.parquet
---
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-200k-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat",
"harness_winogrande_5",
split="train")
```
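The aggregated metrics referenced above are stored in the additional "results" configuration. A minimal sketch for loading them (assuming the "latest" split declared in the YAML configs above) could be:
```python
from datasets import load_dataset

# Load the aggregated results of the run; the "latest" split mirrors the most
# recent timestamped split (here 2023_12_10T08_30_20.014698).
results = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics
```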
## Latest results
These are the [latest results from run 2023-12-10T08:30:20.014698](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat/blob/main/results_2023-12-10T08-30-20.014698.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7630727247628006,
"acc_stderr": 0.028221206890446823,
"acc_norm": 0.770488792020382,
"acc_norm_stderr": 0.028732290582792492,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4838395775572536,
"mc2_stderr": 0.014874467350764172
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910471,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726097
},
"harness|hellaswag|10": {
"acc": 0.6467835092611034,
"acc_stderr": 0.004769924131304649,
"acc_norm": 0.8445528779127663,
"acc_norm_stderr": 0.003615898928269288
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.03885004245800253,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.03885004245800253
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.02407999513006225,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.02407999513006225
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6403508771929824,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.6403508771929824,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7380952380952381,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.7380952380952381,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034527,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034527
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865397,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865397
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.0163199507007674,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.0163199507007674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295127,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295127
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.0196716324131003,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.0196716324131003
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.02244826447683258,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.02244826447683258
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.010919426411848607,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.010919426411848607
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035216,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035216
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.032472243899179465,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.032472243899179465
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.045479609997643757,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.045479609997643757
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331356,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253867,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253867
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9144316730523627,
"acc_stderr": 0.010002965568647286,
"acc_norm": 0.9144316730523627,
"acc_norm_stderr": 0.010002965568647286
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.020903975842083027,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.020903975842083027
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7262569832402235,
"acc_stderr": 0.014912413096372432,
"acc_norm": 0.7262569832402235,
"acc_norm_stderr": 0.014912413096372432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.01970403918385981,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.01970403918385981
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.842443729903537,
"acc_stderr": 0.020692237273583984,
"acc_norm": 0.842443729903537,
"acc_norm_stderr": 0.020692237273583984
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.018498600558790906,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.018498600558790906
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614095,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6173402868318123,
"acc_stderr": 0.01241359588289327,
"acc_norm": 0.6173402868318123,
"acc_norm_stderr": 0.01241359588289327
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650163,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4838395775572536,
"mc2_stderr": 0.014874467350764172
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.01108253884749189
},
"harness|gsm8k|5": {
"acc": 0.514783927217589,
"acc_stderr": 0.0137664630507876
}
}
```
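If you prefer working with the raw JSON file linked above rather than the parquet configurations, a small sketch using `huggingface_hub` could look like this (the exact nesting of the downloaded file is an assumption, hence the fallback):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat",
    filename="results_2023-12-10T08-30-20.014698.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# The per-task metrics may sit under a top-level "results" key; fall back to
# the document root if the layout matches the snippet shown above.
metrics = payload.get("results", payload)
print(metrics["all"])
```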
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
LLMPrompGenAI/LLMPrompts | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 6547
num_examples: 10
download_size: 9328
dataset_size: 6547
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chenmingxuan/Chinese-Patent-Summary | ---
license: apache-2.0
task_categories:
- summarization
language:
- zh
---
A high-quality Chinese patent summarization dataset. |
wiki_hop | ---
annotations_creators:
- crowdsourced
language_creators:
- expert-generated
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: wikihop
pretty_name: WikiHop
tags:
- multi-hop
dataset_info:
- config_name: original
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: candidates
sequence: string
- name: supports
sequence: string
- name: annotations
sequence:
sequence: string
splits:
- name: train
num_bytes: 325952974
num_examples: 43738
- name: validation
num_bytes: 41246536
num_examples: 5129
download_size: 339843061
dataset_size: 367199510
- config_name: masked
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: candidates
sequence: string
- name: supports
sequence: string
- name: annotations
sequence:
sequence: string
splits:
- name: train
num_bytes: 348249138
num_examples: 43738
- name: validation
num_bytes: 44066862
num_examples: 5129
download_size: 339843061
dataset_size: 392316000
---
# Dataset Card for WikiHop
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [QAngaroo](http://qangaroo.cs.ucl.ac.uk/)
- **Repository:** [If the dataset is hosted on github or has a github homepage, add URL here]()
- **Paper:** [Constructing Datasets for Multi-hop Reading Comprehension Across Documents](https://arxiv.org/abs/1710.06481)
- **Leaderboard:** [leaderboard](http://qangaroo.cs.ucl.ac.uk/leaderboard.html)
- **Point of Contact:** [Johannes Welbl](j.welbl@cs.ucl.ac.uk)
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
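Although an example instance is not documented here, the `dataset_info` metadata above declares the fields, so a minimal inspection sketch (assuming the `original` configuration and the `train` split) could be:
```python
from datasets import load_dataset

# Load the "original" configuration declared in the metadata above and look at the
# first training example: id, query, answer, candidates, supports, annotations.
# Depending on your datasets version, trust_remote_code=True may be required.
wikihop = load_dataset("wiki_hop", "original", split="train")
example = wikihop[0]
print(example["query"], "->", example["answer"])
print(len(example["candidates"]), "candidate answers,", len(example["supports"]), "support documents")
```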
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
rajshekar2591/testing | ---
license: afl-3.0
---
|
HuggingFaceH4/OpenHermes-2.5-preferences-v0-deduped | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: dataset
dtype: string
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 4205748183
num_examples: 761597
- name: test
num_bytes: 221026731
num_examples: 40084
download_size: 2282679668
dataset_size: 4426774914
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
HamdanXI/lj_speech_DifferentStructure | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 22050
- name: file
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1360795953.0
num_examples: 4620
- name: test
num_bytes: 490267914.2
num_examples: 1680
download_size: 1828318164
dataset_size: 1851063867.2
---
# Dataset Card for "lj_speech_DifferentStructure"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
daitavan/donut-deu | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 3962318979.458
num_examples: 42621
- name: validation
num_bytes: 487693636.745
num_examples: 5389
- name: test
num_bytes: 489415605.64
num_examples: 5370
download_size: 4805277480
dataset_size: 4939428221.843
---
# Dataset Card for "donut-deu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama_2_optimized_product_titles-esci-part2 | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1526227
num_examples: 480
download_size: 300628
dataset_size: 1526227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-part2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
satwikapaul/painting_movements_2 | ---
license: openrail
---
|
truong-xuan-linh/zola | ---
dataset_info:
features:
- name: bannerImage
dtype: image
- name: en_caption
dtype: string
- name: concat_caption
dtype: string
splits:
- name: train
num_bytes: 49802715.406
num_examples: 1362
download_size: 48774124
dataset_size: 49802715.406
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "zola"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_vilm__Quyen-Plus-v0.1 | ---
pretty_name: Evaluation run of vilm/Quyen-Plus-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vilm/Quyen-Plus-v0.1](https://huggingface.co/vilm/Quyen-Plus-v0.1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vilm__Quyen-Plus-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T13:21:36.966160](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Quyen-Plus-v0.1/blob/main/results_2024-02-23T13-21-36.966160.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6046866876953217,\n\
\ \"acc_stderr\": 0.03337575997237334,\n \"acc_norm\": 0.6067738889060733,\n\
\ \"acc_norm_stderr\": 0.03404792409309952,\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.01694353512840533,\n \"mc2\": 0.53603460375601,\n\
\ \"mc2_stderr\": 0.015483045221053964\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5571672354948806,\n \"acc_norm_stderr\": 0.014515573873348894\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5941047600079665,\n\
\ \"acc_stderr\": 0.004900608529778612,\n \"acc_norm\": 0.785202150965943,\n\
\ \"acc_norm_stderr\": 0.004098427158949247\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638628,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638628\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5026455026455027,\n\
\ \"acc_stderr\": 0.02575094967813038,\n \"acc_norm\": 0.5026455026455027,\n\
\ \"acc_norm_stderr\": 0.02575094967813038\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.025560604721022895,\n\
\ \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.025560604721022895\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787614,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787614\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n\
\ \"acc_stderr\": 0.03019028245350194,\n \"acc_norm\": 0.7549019607843137,\n\
\ \"acc_norm_stderr\": 0.03019028245350194\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415926,\n\
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415926\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3195530726256983,\n\
\ \"acc_stderr\": 0.015595520294147411,\n \"acc_norm\": 0.3195530726256983,\n\
\ \"acc_norm_stderr\": 0.015595520294147411\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.02692084126077616,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.02692084126077616\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.02640614597362568,\n\
\ \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.02640614597362568\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765848,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765848\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227477,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227477\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.01694353512840533,\n \"mc2\": 0.53603460375601,\n\
\ \"mc2_stderr\": 0.015483045221053964\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.712707182320442,\n \"acc_stderr\": 0.012717481052478039\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6004548900682335,\n \
\ \"acc_stderr\": 0.01349166029881599\n }\n}\n```"
repo_url: https://huggingface.co/vilm/Quyen-Plus-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|arc:challenge|25_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|gsm8k|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hellaswag|10_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T13-21-36.966160.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T13-21-36.966160.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- '**/details_harness|winogrande|5_2024-02-23T13-21-36.966160.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T13-21-36.966160.parquet'
- config_name: results
data_files:
- split: 2024_02_23T13_21_36.966160
path:
- results_2024-02-23T13-21-36.966160.parquet
- split: latest
path:
- results_2024-02-23T13-21-36.966160.parquet
---
# Dataset Card for Evaluation run of vilm/Quyen-Plus-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vilm/Quyen-Plus-v0.1](https://huggingface.co/vilm/Quyen-Plus-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vilm__Quyen-Plus-v0.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-23T13:21:36.966160](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Quyen-Plus-v0.1/blob/main/results_2024-02-23T13-21-36.966160.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6046866876953217,
"acc_stderr": 0.03337575997237334,
"acc_norm": 0.6067738889060733,
"acc_norm_stderr": 0.03404792409309952,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.01694353512840533,
"mc2": 0.53603460375601,
"mc2_stderr": 0.015483045221053964
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5571672354948806,
"acc_norm_stderr": 0.014515573873348894
},
"harness|hellaswag|10": {
"acc": 0.5941047600079665,
"acc_stderr": 0.004900608529778612,
"acc_norm": 0.785202150965943,
"acc_norm_stderr": 0.004098427158949247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638628,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638628
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813038,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813038
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022895,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787614,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787614
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415926,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415926
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3195530726256983,
"acc_stderr": 0.015595520294147411,
"acc_norm": 0.3195530726256983,
"acc_norm_stderr": 0.015595520294147411
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684965,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.02692084126077616,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.02692084126077616
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.02640614597362568,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.02640614597362568
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765848,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765848
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227477,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227477
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.01694353512840533,
"mc2": 0.53603460375601,
"mc2_stderr": 0.015483045221053964
},
"harness|winogrande|5": {
"acc": 0.712707182320442,
"acc_stderr": 0.012717481052478039
},
"harness|gsm8k|5": {
"acc": 0.6004548900682335,
"acc_stderr": 0.01349166029881599
}
}
```
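The same aggregated numbers can also be read programmatically from the `results` configuration defined in the YAML above, whose `latest` split points at this run. A minimal sketch, assuming the results parquet exposes the metrics as record fields:
```python
from datasets import load_dataset

# Aggregated metrics for the latest run ("results" config, "latest" split).
# Assumption: the parquet stores records whose fields mirror the JSON above.
results = load_dataset(
    "open-llm-leaderboard/details_vilm__Quyen-Plus-v0.1",
    "results",
    split="latest",
)
print(results[0])
```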
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pgurazada1/entities-laptop | ---
license: apache-2.0
task_categories:
- text-classification
---
A dataset that can be used to fine-tune models to extract entities in a specific format. |
liuyanchen1015/MULTI_VALUE_qqp_adj_postfix | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2856162
num_examples: 16756
- name: test
num_bytes: 28114750
num_examples: 166388
- name: train
num_bytes: 25519781
num_examples: 149488
download_size: 35380273
dataset_size: 56490693
---
# Dataset Card for "MULTI_VALUE_qqp_adj_postfix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kostayli/ru-WikiSQL-25k | ---
task_categories:
- text2text-generation
language:
- ru
pretty_name: wikisql-ru-low
size_categories:
- 10K<n<100K
--- |
cgato/SlimOrcaDedupCleaned | ---
license: mit
---
### What is this dataset?
Half of the Slim Orca Deduped dataset, but further cleaned by removing instances of soft prompting.
I removed a ton of prompt prefixes which did not add any information or were redundant. Ex. "Question:", "Q:", "Write the Answer:", "Read this:", "Instructions:"
I also removed a ton of prompt suffixes which were simply there to lead the model to answer as expected. Ex. "The answer is...", "Answer:", "A:", "Summary:", "Output:", "Highlight:"
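The cleanup described above amounts to stripping a fixed list of lead-in and lead-out phrases from each prompt. The exact lists and matching rules used to build this dataset are not published, so the following is only a rough sketch of that kind of filtering, using the example phrases named above:
```python
# Rough sketch only: the real prefix/suffix lists and matching rules used for
# this dataset are not published; these are just the example phrases above.
PREFIXES = ["Question:", "Q:", "Write the Answer:", "Read this:", "Instructions:"]
SUFFIXES = ["The answer is...", "Answer:", "A:", "Summary:", "Output:", "Highlight:"]

def strip_soft_prompting(text: str) -> str:
    """Remove leading/trailing soft-prompting phrases from a prompt string."""
    text = text.strip()
    for prefix in PREFIXES:
        if text.startswith(prefix):
            text = text[len(prefix):].lstrip()
    for suffix in SUFFIXES:
        if text.endswith(suffix):
            text = text[:-len(suffix)].rstrip()
    return text

print(strip_soft_prompting("Q: Summarize the passage below. Summary:"))
```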
### Why?
I cleaned this dataset up because a lot of the prompt prefixes were just wasted tokens the model had to process.
Additionally, they were repeated over thousands of prompts, which could lead the model to overtrain.
For the prompt suffixes, these were cleaned because they leaned too hard on the base model's original completion behavior, in addition to being very repetitive. |
msubhasish28/reuters_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
---
# Dataset Card for "reuters_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/laffey_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of laffey/ラフィー/拉菲 (Azur Lane)
This is the dataset of laffey/ラフィー/拉菲 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `rabbit_ears, animal_ears, long_hair, twintails, bangs, red_eyes, hair_between_eyes, white_hair, hairband, very_long_hair, fake_animal_ears, hair_ornament, red_hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 709.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 384.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1310 | 869.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 616.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1310 | 1.24 GiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
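Any of the packages above can also be fetched programmatically rather than through the download links; a small sketch using `huggingface_hub` (the filename follows the links in the table, here the 800px IMG+TXT package):
```python
import zipfile
from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/laffey_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# Extract the images and their .txt tag files.
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall('laffey_800')
```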
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/laffey_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blue_flower, hair_flower, holding_bouquet, looking_at_viewer, official_alternate_costume, solo, wedding_dress, white_dress, blush, closed_mouth, smile, white_gloves, bow, ribbon, simple_background, white_flower |
| 1 | 10 |  |  |  |  |  | 1girl, long_sleeves, off_shoulder, pink_jacket, simple_background, solo, upper_body, white_background, collarbone, bare_shoulders, blush, open_jacket, looking_at_viewer, strap_slip, white_camisole, closed_mouth, parted_lips, sleeves_past_wrists |
| 2 | 5 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, off_shoulder, open_jacket, pink_jacket, pleated_skirt, red_skirt, solo, bare_shoulders, blush, strap_slip, white_background, white_camisole, cleavage, collarbone, simple_background, closed_mouth, cowboy_shot, grey_hair, sitting, sleeves_past_wrists, small_breasts |
| 3 | 12 |  |  |  |  |  | 1girl, blush, long_sleeves, looking_at_viewer, off_shoulder, pink_jacket, pleated_skirt, red_skirt, solo, white_camisole, bare_shoulders, collarbone, open_jacket, white_thighhighs, parted_lips, simple_background, white_background, sleeves_past_wrists, strap_slip, fur_trim, sitting |
| 4 | 9 |  |  |  |  |  | 1girl, bikini_top_only, blush, looking_at_viewer, navel, pleated_skirt, retrofit_(azur_lane), solo, white_bikini, white_skirt, white_thighhighs, black_hairband, open_jacket, small_breasts, bare_shoulders, belt, buckle, collarbone, long_sleeves, miniskirt, :o, black_jacket, sidelocks, stomach, parted_lips |
| 5 | 17 |  |  |  |  |  | hair_bow, long_sleeves, looking_at_viewer, low_twintails, 1girl, blush, solo, hanfu, red_bow, red_dress, parted_lips, collarbone, holding, jingle_bell, :o, sitting, white_background, frills, see-through, pink_dress, shawl, simple_background, wide_sleeves |
| 6 | 28 |  |  |  |  |  | obi, blush, long_sleeves, 1girl, floral_print, red_bow, solo, wide_sleeves, hair_bow, looking_at_viewer, print_kimono, sidelocks, double_bun, holding_food, candy_apple, parted_lips, blue_kimono, purple_kimono, white_background |
| 7 | 23 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, blue_skirt, pleated_skirt, beret, blue_headwear, blue_shirt, midriff, white_sailor_collar, bare_shoulders, sidelocks, white_background, white_thighhighs, wrist_cuffs, simple_background, detached_sleeves, hair_bow, navel, red_bow, yellow_bow, parted_lips, blue_serafuku, bowtie, crop_top, blue_choker, puffy_short_sleeves, sleeveless_shirt, zettai_ryouiki, blue_sleeves |
| 8 | 31 |  |  |  |  |  | looking_at_viewer, 1girl, bare_shoulders, solo, midriff, detached_sleeves, hair_bow, long_sleeves, navel, plaid_skirt, pleated_skirt, white_pantyhose, headset, collarbone, blush, pink_skirt, crop_top, sidelocks, parted_lips, black_choker, frills, shirt, small_breasts |
| 9 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, playboy_bunny, solo, small_breasts, white_pantyhose, cup, full_body, strapless_leotard, blush, hair_ribbon, official_alternate_costume, blue_leotard, covered_navel, holding_tray, no_shoes |
| 10 | 5 |  |  |  |  |  | 2girls, blush, collarbone, bare_arms, bare_shoulders, navel, parted_lips, small_breasts, solo_focus, heart, looking_at_viewer, white_bikini, 1girl, bow, grey_hair, groin, halterneck, retrofit_(azur_lane) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blue_flower | hair_flower | holding_bouquet | looking_at_viewer | official_alternate_costume | solo | wedding_dress | white_dress | blush | closed_mouth | smile | white_gloves | bow | ribbon | simple_background | white_flower | long_sleeves | off_shoulder | pink_jacket | upper_body | white_background | collarbone | open_jacket | strap_slip | white_camisole | parted_lips | sleeves_past_wrists | pleated_skirt | red_skirt | cleavage | cowboy_shot | grey_hair | sitting | small_breasts | white_thighhighs | fur_trim | bikini_top_only | navel | retrofit_(azur_lane) | white_bikini | white_skirt | black_hairband | belt | buckle | miniskirt | :o | black_jacket | sidelocks | stomach | hair_bow | low_twintails | hanfu | red_bow | red_dress | holding | jingle_bell | frills | see-through | pink_dress | shawl | wide_sleeves | obi | floral_print | print_kimono | double_bun | holding_food | candy_apple | blue_kimono | purple_kimono | blue_skirt | beret | blue_headwear | blue_shirt | midriff | white_sailor_collar | wrist_cuffs | detached_sleeves | yellow_bow | blue_serafuku | bowtie | crop_top | blue_choker | puffy_short_sleeves | sleeveless_shirt | zettai_ryouiki | blue_sleeves | plaid_skirt | white_pantyhose | headset | pink_skirt | black_choker | shirt | playboy_bunny | cup | full_body | strapless_leotard | hair_ribbon | blue_leotard | covered_navel | holding_tray | no_shoes | 2girls | bare_arms | solo_focus | heart | groin | halterneck |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:--------------|:--------------|:------------------|:--------------------|:-----------------------------|:-------|:----------------|:--------------|:--------|:---------------|:--------|:---------------|:------|:---------|:--------------------|:---------------|:---------------|:---------------|:--------------|:-------------|:-------------------|:-------------|:--------------|:-------------|:-----------------|:--------------|:----------------------|:----------------|:------------|:-----------|:--------------|:------------|:----------|:----------------|:-------------------|:-----------|:------------------|:--------|:-----------------------|:---------------|:--------------|:-----------------|:-------|:---------|:------------|:-----|:---------------|:------------|:----------|:-----------|:----------------|:--------|:----------|:------------|:----------|:--------------|:---------|:--------------|:-------------|:--------|:---------------|:------|:---------------|:---------------|:-------------|:---------------|:--------------|:--------------|:----------------|:-------------|:--------|:----------------|:-------------|:----------|:----------------------|:--------------|:-------------------|:-------------|:----------------|:---------|:-----------|:--------------|:----------------------|:-------------------|:-----------------|:---------------|:--------------|:------------------|:----------|:-------------|:---------------|:--------|:----------------|:------|:------------|:--------------------|:--------------|:---------------|:----------------|:---------------|:-----------|:---------|:------------|:-------------|:--------|:--------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | | | X | | X | | | X | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | | X | | X | | | X | X | | | | | X | | X | X | X | | X | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | | | | X | | X | | | X | | | | | | X | | X | X | X | | X | X | X | X | X | X | X | X | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | | | X | | X | | | X | | | | | | | | X | | | | | X | X | | | X | | X | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 17 |  |  |  |  |  | X | | | | | X | | X | | | X | | | | | | X | | X | | | | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 28 |  |  |  |  |  | X | | | | | X | | X | | | X | | | | | | | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 23 |  |  |  |  |  | X | X | | | | X | | X | | | X | | | | | | X | | | | | | X | | | | | X | | X | | | | | | | X | | | X | | | | | | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 8 | 31 |  |  |  |  |  | X | X | | | | X | | X | | | X | | | | | | | | X | | | | | X | | | | X | | X | | | | | | X | | | | X | | | | | | | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | X | | | X | | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | | | | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | | | X | | | | | X | | | | X | | | | | | | | | X | | | | X | | | | | | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
pyong/llamaPrompts | ---
license: apache-2.0
--- |
joseloncon/Ejemplo.mini-coupier | ---
license: apache-2.0
---
|
linhphanff/phobert-vietnamse-nomic-embed-mlm | ---
license: apache-2.0
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: special_tokens_mask
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 15014344800
num_examples: 1046150
download_size: 4075336926
dataset_size: 15014344800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mozilla-foundation/common_voice_11_0 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
ab:
- 10K<n<100K
ar:
- 100K<n<1M
as:
- 1K<n<10K
ast:
- n<1K
az:
- n<1K
ba:
- 100K<n<1M
bas:
- 1K<n<10K
be:
- 100K<n<1M
bg:
- 1K<n<10K
bn:
- 100K<n<1M
br:
- 10K<n<100K
ca:
- 1M<n<10M
ckb:
- 100K<n<1M
cnh:
- 1K<n<10K
cs:
- 10K<n<100K
cv:
- 10K<n<100K
cy:
- 100K<n<1M
da:
- 1K<n<10K
de:
- 100K<n<1M
dv:
- 10K<n<100K
el:
- 10K<n<100K
en:
- 1M<n<10M
eo:
- 1M<n<10M
es:
- 1M<n<10M
et:
- 10K<n<100K
eu:
- 100K<n<1M
fa:
- 100K<n<1M
fi:
- 10K<n<100K
fr:
- 100K<n<1M
fy-NL:
- 10K<n<100K
ga-IE:
- 1K<n<10K
gl:
- 10K<n<100K
gn:
- 1K<n<10K
ha:
- 1K<n<10K
hi:
- 10K<n<100K
hsb:
- 1K<n<10K
hu:
- 10K<n<100K
hy-AM:
- 1K<n<10K
ia:
- 10K<n<100K
id:
- 10K<n<100K
ig:
- 1K<n<10K
it:
- 100K<n<1M
ja:
- 10K<n<100K
ka:
- 10K<n<100K
kab:
- 100K<n<1M
kk:
- 1K<n<10K
kmr:
- 10K<n<100K
ky:
- 10K<n<100K
lg:
- 100K<n<1M
lt:
- 10K<n<100K
lv:
- 1K<n<10K
mdf:
- n<1K
mhr:
- 100K<n<1M
mk:
- n<1K
ml:
- 1K<n<10K
mn:
- 10K<n<100K
mr:
- 10K<n<100K
mrj:
- 10K<n<100K
mt:
- 10K<n<100K
myv:
- 1K<n<10K
nan-tw:
- 10K<n<100K
ne-NP:
- n<1K
nl:
- 10K<n<100K
nn-NO:
- n<1K
or:
- 1K<n<10K
pa-IN:
- 1K<n<10K
pl:
- 100K<n<1M
pt:
- 100K<n<1M
rm-sursilv:
- 1K<n<10K
rm-vallader:
- 1K<n<10K
ro:
- 10K<n<100K
ru:
- 100K<n<1M
rw:
- 1M<n<10M
sah:
- 1K<n<10K
sat:
- n<1K
sc:
- 1K<n<10K
sk:
- 10K<n<100K
skr:
- 1K<n<10K
sl:
- 10K<n<100K
sr:
- 1K<n<10K
sv-SE:
- 10K<n<100K
sw:
- 100K<n<1M
ta:
- 100K<n<1M
th:
- 100K<n<1M
ti:
- n<1K
tig:
- n<1K
tok:
- 1K<n<10K
tr:
- 10K<n<100K
tt:
- 10K<n<100K
tw:
- n<1K
ug:
- 10K<n<100K
uk:
- 10K<n<100K
ur:
- 100K<n<1M
uz:
- 100K<n<1M
vi:
- 10K<n<100K
vot:
- n<1K
yue:
- 10K<n<100K
zh-CN:
- 100K<n<1M
zh-HK:
- 100K<n<1M
zh-TW:
- 100K<n<1M
source_datasets:
- extended|common_voice
task_categories:
- automatic-speech-recognition
task_ids: []
paperswithcode_id: common-voice
pretty_name: Common Voice Corpus 11.0
language_bcp47:
- ab
- ar
- as
- ast
- az
- ba
- bas
- be
- bg
- bn
- br
- ca
- ckb
- cnh
- cs
- cv
- cy
- da
- de
- dv
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy-NL
- ga-IE
- gl
- gn
- ha
- hi
- hsb
- hu
- hy-AM
- ia
- id
- ig
- it
- ja
- ka
- kab
- kk
- kmr
- ky
- lg
- lt
- lv
- mdf
- mhr
- mk
- ml
- mn
- mr
- mrj
- mt
- myv
- nan-tw
- ne-NP
- nl
- nn-NO
- or
- pa-IN
- pl
- pt
- rm-sursilv
- rm-vallader
- ro
- ru
- rw
- sah
- sat
- sc
- sk
- skr
- sl
- sr
- sv-SE
- sw
- ta
- th
- ti
- tig
- tok
- tr
- tt
- tw
- ug
- uk
- ur
- uz
- vi
- vot
- yue
- zh-CN
- zh-HK
- zh-TW
extra_gated_prompt: By clicking on “Access repository” below, you also agree to not
attempt to determine the identity of speakers in the Common Voice dataset.
---
# Dataset Card for Common Voice Corpus 11.0
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Anton Lozhkov](mailto:anton@huggingface.co)
### Dataset Summary
The Common Voice dataset consists of unique MP3 files and corresponding text files.
Many of the 24210 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 16413 validated hours in 100 languages, but more voices and languages are always being added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Autoevaluate Leaderboard](https://huggingface.co/spaces/autoevaluate/leaderboards?dataset=mozilla-foundation%2Fcommon_voice_11_0&only_verified=0&task=automatic-speech-recognition&config=ar&split=test&metric=wer)
### Languages
```
Abkhaz, Arabic, Armenian, Assamese, Asturian, Azerbaijani, Basaa, Bashkir, Basque, Belarusian, Bengali, Breton, Bulgarian, Cantonese, Catalan, Central Kurdish, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Danish, Dhivehi, Dutch, English, Erzya, Esperanto, Estonian, Finnish, French, Frisian, Galician, Georgian, German, Greek, Guarani, Hakha Chin, Hausa, Hill Mari, Hindi, Hungarian, Igbo, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kazakh, Kinyarwanda, Kurmanji Kurdish, Kyrgyz, Latvian, Lithuanian, Luganda, Macedonian, Malayalam, Maltese, Marathi, Meadow Mari, Moksha, Mongolian, Nepali, Norwegian Nynorsk, Odia, Persian, Polish, Portuguese, Punjabi, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Santali (Ol Chiki), Saraiki, Sardinian, Serbian, Slovak, Slovenian, Sorbian, Upper, Spanish, Swahili, Swedish, Taiwanese (Minnan), Tamil, Tatar, Thai, Tigre, Tigrinya, Toki Pona, Turkish, Twi, Ukrainian, Urdu, Uyghur, Uzbek, Vietnamese, Votic, Welsh
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
```python
from datasets import load_dataset
cv_11 = load_dataset("mozilla-foundation/common_voice_11_0", "hi", split="train")
```
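Note that Common Voice 11.0 is a gated dataset, so the call above assumes you have already accepted the access conditions on the dataset page. If loading fails with an authentication error, log in first (for example with `huggingface-cli login`) or pass the token explicitly; a short sketch using the same `use_auth_token=True` flag that appears in the preprocessing example further down:
```python
from datasets import load_dataset

# Requires accepting the access conditions on the dataset page and being
# logged in to the Hugging Face Hub (e.g. via `huggingface-cli login`).
cv_11 = load_dataset(
    "mozilla-foundation/common_voice_11_0",
    "hi",
    split="train",
    use_auth_token=True,
)
```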
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_11 = load_dataset("mozilla-foundation/common_voice_11_0", "hi", split="train", streaming=True)
print(next(iter(cv_11)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_11 = load_dataset("mozilla-foundation/common_voice_11_0", "hi", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_11), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_11, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_11 = load_dataset("mozilla-foundation/common_voice_11_0", "hi", split="train", streaming=True)
dataloader = DataLoader(cv_11, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 11 with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
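Since decoding and resampling of the `audio` field happen lazily when a sample is accessed, the column can also be re-cast to a different sampling rate before training. A minimal sketch (16 kHz is a common target for speech recognition models, not a requirement of this dataset):
```python
from datasets import load_dataset, Audio

cv_11 = load_dataset("mozilla-foundation/common_voice_11_0", "hi", split="train")

# Re-declare the audio column so samples are decoded at 16 kHz instead of the
# original 48 kHz whenever they are accessed.
cv_11 = cv_11.cast_column("audio", Audio(sampling_rate=16_000))

sample = cv_11[0]["audio"]      # decoding + resampling happen here, on access
print(sample["sampling_rate"])  # 16000
```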
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated with reviewers and received upvotes that the data is of high quality.
The invalidated data is data that has been invalidated by reviewers
and received downvotes indicating that the data is of low quality.
The reported data is data that has been reported, for different reasons.
The other data is data that has not yet been reviewed.
The dev, test and train splits all contain data that has been reviewed, deemed of high quality, and then partitioned into dev, test and train.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them into practice.
Many examples in this dataset have trailing quotation marks, e.g _“the cat sat on the mat.”_. These trailing quotation marks do not change the actual meaning of the sentence, and it is nearly impossible to infer whether a sentence is a quotation or not from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_11_0", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
zolak/twitter_dataset_78_1713064204 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2812169
num_examples: 6899
download_size: 1428555
dataset_size: 2812169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cognizedeepak/CognizeDeepak | ---
license: other
---
|
hails/agieval-logiqa-en | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 852087
num_examples: 651
download_size: 420355
dataset_size: 852087
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
language:
- en
---
# Dataset Card for "agieval-logiqa-en"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo, following dmayhem93/agieval-* datasets on the HF hub.
This dataset contains the contents of the LogiQA English subtask of AGIEval, as accessed in https://github.com/ruixiangcui/AGIEval/commit/5c77d073fda993f1652eaae3cf5d04cc5fd21d40 .
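A quick way to inspect the data is to load the single `test` split and map the `gold` index back onto the `choices`; a minimal sketch based on the feature list above (the field semantics are inferred from that list, not documented here by the upstream repo):
```python
from datasets import load_dataset

logiqa = load_dataset("hails/agieval-logiqa-en", split="test")

example = logiqa[0]
print(example["query"])                      # prompt/question text
for i, choice in enumerate(example["choices"]):
    print(f"({chr(ord('A') + i)}) {choice}")
gold_index = example["gold"][0]              # `gold` is a list holding the correct index
print("answer:", example["choices"][gold_index])
```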
Citation:
```
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
Please make sure to cite all the individual datasets in your paper when you use them. We provide the relevant citation information below:
```
@inproceedings{ling-etal-2017-program,
title = "Program Induction by Rationale Generation: Learning to Solve and Explain Algebraic Word Problems",
author = "Ling, Wang and
Yogatama, Dani and
Dyer, Chris and
Blunsom, Phil",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P17-1015",
doi = "10.18653/v1/P17-1015",
pages = "158--167",
abstract = "Solving algebraic word problems requires executing a series of arithmetic operations{---}a program{---}to obtain a final answer. However, since programs can be arbitrarily complicated, inducing them directly from question-answer pairs is a formidable challenge. To make this task more feasible, we solve these problems by generating answer rationales, sequences of natural language and human-readable mathematical expressions that derive the final answer through a series of small steps. Although rationales do not explicitly specify programs, they provide a scaffolding for their structure via intermediate milestones. To evaluate our approach, we have created a new 100,000-sample dataset of questions, answers and rationales. Experimental results show that indirect supervision of program learning via answer rationales is a promising strategy for inducing arithmetic programs.",
}
@inproceedings{hendrycksmath2021,
title={Measuring Mathematical Problem Solving With the MATH Dataset},
author={Dan Hendrycks and Collin Burns and Saurav Kadavath and Akul Arora and Steven Basart and Eric Tang and Dawn Song and Jacob Steinhardt},
journal={NeurIPS},
year={2021}
}
@inproceedings{Liu2020LogiQAAC,
title={LogiQA: A Challenge Dataset for Machine Reading Comprehension with Logical Reasoning},
author={Jian Liu and Leyang Cui and Hanmeng Liu and Dandan Huang and Yile Wang and Yue Zhang},
booktitle={International Joint Conference on Artificial Intelligence},
year={2020}
}
@inproceedings{zhong2019jec,
title={JEC-QA: A Legal-Domain Question Answering Dataset},
author={Zhong, Haoxi and Xiao, Chaojun and Tu, Cunchao and Zhang, Tianyang and Liu, Zhiyuan and Sun, Maosong},
booktitle={Proceedings of AAAI},
year={2020},
}
@article{Wang2021FromLT,
title={From LSAT: The Progress and Challenges of Complex Reasoning},
author={Siyuan Wang and Zhongkun Liu and Wanjun Zhong and Ming Zhou and Zhongyu Wei and Zhumin Chen and Nan Duan},
journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing},
year={2021},
volume={30},
pages={2201-2216}
}
``` |
chrislee973/whales-stft | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 1994137410.292
num_examples: 29999
download_size: 1794160159
dataset_size: 1994137410.292
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_TheBloke__Planner-7B-fp16 | ---
pretty_name: Evaluation run of TheBloke/Planner-7B-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Planner-7B-fp16](https://huggingface.co/TheBloke/Planner-7B-fp16) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Planner-7B-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T22:53:17.425716](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Planner-7B-fp16/blob/main/results_2023-10-21T22-53-17.425716.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219126,\n \"f1\": 0.056186031879194784,\n\
\ \"f1_stderr\": 0.0012858243614759428,\n \"acc\": 0.3749593848153363,\n\
\ \"acc_stderr\": 0.008901319861891403\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219126,\n\
\ \"f1\": 0.056186031879194784,\n \"f1_stderr\": 0.0012858243614759428\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \
\ \"acc_stderr\": 0.00510610785374419\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038616\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Planner-7B-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T22_53_17.425716
path:
- '**/details_harness|drop|3_2023-10-21T22-53-17.425716.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T22-53-17.425716.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T22_53_17.425716
path:
- '**/details_harness|gsm8k|5_2023-10-21T22-53-17.425716.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T22-53-17.425716.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:47:15.541190.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:47:15.541190.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T22_53_17.425716
path:
- '**/details_harness|winogrande|5_2023-10-21T22-53-17.425716.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T22-53-17.425716.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_47_15.541190
path:
- results_2023-07-19T16:47:15.541190.parquet
- split: 2023_10_21T22_53_17.425716
path:
- results_2023-10-21T22-53-17.425716.parquet
- split: latest
path:
- results_2023-10-21T22-53-17.425716.parquet
---
# Dataset Card for Evaluation run of TheBloke/Planner-7B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Planner-7B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Planner-7B-fp16](https://huggingface.co/TheBloke/Planner-7B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Planner-7B-fp16",
"harness_winogrande_5",
split="train")
```
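You can also target a specific run rather than the latest one, or read the aggregated metrics directly, by using the timestamped split names and the "results" configuration listed in the configs above; a sketch:
```python
from datasets import load_dataset

# Details of one specific run, addressed by its timestamped split name
winogrande_run = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Planner-7B-fp16",
    "harness_winogrande_5",
    split="2023_10_21T22_53_17.425716",
)

# Aggregated metrics for the latest run
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Planner-7B-fp16",
    "results",
    split="latest",
)
```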
## Latest results
These are the [latest results from run 2023-10-21T22:53:17.425716](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Planner-7B-fp16/blob/main/results_2023-10-21T22-53-17.425716.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428,
"acc": 0.3749593848153363,
"acc_stderr": 0.008901319861891403
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.00510610785374419
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038616
}
}
```
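If only the aggregated numbers above are needed (rather than per-sample details), the "results" configuration can be loaded directly. A minimal sketch, assuming the config and split names listed in the YAML metadata above:
```python
from datasets import load_dataset

# Aggregated metrics only; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Planner-7B-fp16",
	"results",
	split="latest")

print(results[0])  # inspect the aggregated scores for the latest run
```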
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tomaszki/classification_100k | ---
language:
- en
dataset_info:
features:
- name: text
dtype: string
- name: text_label
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 92835478
num_examples: 100000
download_size: 61482556
dataset_size: 92835478
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ohtaman/oscar_ja_clean_filtered | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: warc_headers
struct:
- name: warc-record-id
dtype: string
- name: warc-date
dtype: string
- name: content-type
dtype: string
- name: content-length
dtype: int32
- name: warc-type
dtype: string
- name: warc-identified-content-language
dtype: string
- name: warc-refers-to
dtype: string
- name: warc-target-uri
dtype: string
- name: warc-block-digest
dtype: string
- name: identification
struct:
- name: label
dtype: string
- name: prob
dtype: float32
- name: harmful_pp
dtype: float32
- name: tlsh
dtype: string
- name: quality_warnings
sequence: string
- name: categories
sequence: string
- name: sentence_identifications
list:
- name: label
dtype: string
- name: prob
dtype: float32
- name: kenlm_tatoeba
dtype: float64
- name: kenlm_aozora_kids
dtype: float64
splits:
- name: train
num_bytes: 10439287668.360512
num_examples: 4745089
- name: test
num_bytes: 2200019.3607244273
num_examples: 1000
download_size: 7113941574
dataset_size: 10441487687.721235
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
This is the oscar dataset with basic cleaning applied, then filtered by the perplexity of kenlm models trained on tatoeba and Aozora Bunko (children's works in modern kanji and kana orthography).
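A minimal sketch of how the perplexity columns can be used when loading the data, assuming the column names from the YAML metadata above (the threshold is illustrative, not the one used to build the dataset):
```python
from datasets import load_dataset

# Load the cleaned Japanese OSCAR subset (the small "test" split here).
ds = load_dataset("ohtaman/oscar_ja_clean_filtered", split="test")

# Keep only documents below an illustrative perplexity threshold
# (not the threshold used to build the dataset).
low_ppl = ds.filter(
    lambda ex: ex["kenlm_tatoeba"] < 1e4 and ex["kenlm_aozora_kids"] < 1e4
)
print(len(low_ppl), "of", len(ds), "documents kept")
```
 |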
ddrg/math_formula_retrieval | ---
dataset_info:
features:
- name: formula1
dtype: string
- name: formula2
dtype: string
- name: label
dtype: bool
- name: formula1_name_id
dtype: string
splits:
- name: train
num_bytes: 7285320882
num_examples: 21348512
- name: test
num_bytes: 809630657
num_examples: 2372048
download_size: 3656462517
dataset_size: 8094951539
---
# Dataset Card for "math_formula_retrieval"
Mathematical dataset based on 71 well-known mathematical identities. Each entry consists of two identities (in formula or textual form) together with a label indicating whether the two versions describe the same mathematical identity. The false pairs are not chosen at random but are made intentionally hard by modifying equivalent representations (see [ddrg/named_math_formulas](https://huggingface.co/datasets/ddrg/named_math_formulas) for more information). At most 400,000 versions are generated per identity. There are ten times more falsified versions than true ones, so the dataset can be used for training with changing false examples every epoch.
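A minimal sketch for inspecting an entry, using the column names from the YAML metadata above:
```python
from datasets import load_dataset

ds = load_dataset("ddrg/math_formula_retrieval", split="test")

row = ds[0]
print(row["formula1"])          # one rendering of an identity
print(row["formula2"])          # a second rendering (true pair or hard negative)
print(row["label"])             # True if both describe the same identity
print(row["formula1_name_id"])  # which of the 71 identities formula1 belongs to
```
 |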
open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s | ---
pretty_name: Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T23:09:18.709191](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s/blob/main/results_2024-02-16T23-09-18.709191.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6322971394538486,\n\
\ \"acc_stderr\": 0.03234149565305396,\n \"acc_norm\": 0.63180337036129,\n\
\ \"acc_norm_stderr\": 0.03300828288156676,\n \"mc1\": 0.4675642594859241,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6329203738044532,\n\
\ \"mc2_stderr\": 0.01541374646266871\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620444,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537302\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6754630551682932,\n\
\ \"acc_stderr\": 0.004672447046820005,\n \"acc_norm\": 0.8568014339772954,\n\
\ \"acc_norm_stderr\": 0.003495593662520757\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544074,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544074\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640763,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640763\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066302,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n\
\ \"acc_stderr\": 0.016165847583563295,\n \"acc_norm\": 0.37206703910614525,\n\
\ \"acc_norm_stderr\": 0.016165847583563295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379774,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379774\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4675642594859241,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6329203738044532,\n\
\ \"mc2_stderr\": 0.01541374646266871\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597207\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \
\ \"acc_stderr\": 0.012643544762873358\n }\n}\n```"
repo_url: https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|arc:challenge|25_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|gsm8k|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hellaswag|10_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T23-09-18.709191.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- '**/details_harness|winogrande|5_2024-02-16T23-09-18.709191.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T23-09-18.709191.parquet'
- config_name: results
data_files:
- split: 2024_02_16T23_09_18.709191
path:
- results_2024-02-16T23-09-18.709191.parquet
- split: latest
path:
- results_2024-02-16T23-09-18.709191.parquet
---
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s",
"harness_winogrande_5",
split="train")
```
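Each evaluated task also has its own configuration (listed in the YAML metadata above). A minimal sketch for loading a single task's details, assuming those config and split names:
```python
from datasets import load_dataset

# Per-task details; "latest" resolves to the most recent run in the metadata.
details = load_dataset(
    "open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```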
## Latest results
These are the [latest results from run 2024-02-16T23:09:18.709191](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s/blob/main/results_2024-02-16T23-09-18.709191.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6322971394538486,
"acc_stderr": 0.03234149565305396,
"acc_norm": 0.63180337036129,
"acc_norm_stderr": 0.03300828288156676,
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6329203738044532,
"mc2_stderr": 0.01541374646266871
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620444,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537302
},
"harness|hellaswag|10": {
"acc": 0.6754630551682932,
"acc_stderr": 0.004672447046820005,
"acc_norm": 0.8568014339772954,
"acc_norm_stderr": 0.003495593662520757
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640763,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990925,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066302,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.016165847583563295,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.016165847583563295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379774,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379774
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018515,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018515
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6329203738044532,
"mc2_stderr": 0.01541374646266871
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597207
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_allknowingroger__LimmyAutomerge-7B-slerp | ---
pretty_name: Evaluation run of allknowingroger/LimmyAutomerge-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/LimmyAutomerge-7B-slerp](https://huggingface.co/allknowingroger/LimmyAutomerge-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__LimmyAutomerge-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-11T06:34:56.974523](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__LimmyAutomerge-7B-slerp/blob/main/results_2024-04-11T06-34-56.974523.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530195519488943,\n\
\ \"acc_stderr\": 0.032049777506769586,\n \"acc_norm\": 0.6522366684594754,\n\
\ \"acc_norm_stderr\": 0.032722255479873584,\n \"mc1\": 0.620563035495716,\n\
\ \"mc1_stderr\": 0.01698703926614297,\n \"mc2\": 0.77280388116297,\n\
\ \"mc2_stderr\": 0.013838767457894557\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7073378839590444,\n \"acc_stderr\": 0.013295916103619423,\n\
\ \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714299940250946,\n\
\ \"acc_stderr\": 0.004508239594503832,\n \"acc_norm\": 0.8904600677155945,\n\
\ \"acc_norm_stderr\": 0.003116771577319422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.01275285834653313,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.01275285834653313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.620563035495716,\n\
\ \"mc1_stderr\": 0.01698703926614297,\n \"mc2\": 0.77280388116297,\n\
\ \"mc2_stderr\": 0.013838767457894557\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433537\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \
\ \"acc_stderr\": 0.012579398235589527\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/LimmyAutomerge-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|arc:challenge|25_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|gsm8k|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hellaswag|10_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-34-56.974523.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T06-34-56.974523.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_11T06_34_56.974523
path:
- '**/details_harness|winogrande|5_2024-04-11T06-34-56.974523.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-11T06-34-56.974523.parquet'
- config_name: results
data_files:
- split: 2024_04_11T05_00_33.287775
path:
- results_2024-04-11T05-00-33.287775.parquet
- split: 2024_04_11T06_34_56.974523
path:
- results_2024-04-11T06-34-56.974523.parquet
- split: latest
path:
- results_2024-04-11T06-34-56.974523.parquet
---
# Dataset Card for Evaluation run of allknowingroger/LimmyAutomerge-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/LimmyAutomerge-7B-slerp](https://huggingface.co/allknowingroger/LimmyAutomerge-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__LimmyAutomerge-7B-slerp",
"harness_winogrande_5",
split="train")
```
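The aggregated metrics can be loaded the same way from the "results" configuration. The sketch below is a minimal example assuming the `latest` split listed in the configs above; the exact column layout of the results parquet may vary between harness versions.
```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split always points to the
# most recent results file (here results_2024-04-11T06-34-56.974523.parquet).
results = load_dataset(
    "open-llm-leaderboard/details_allknowingroger__LimmyAutomerge-7B-slerp",
    "results",
    split="latest",
)

# Each row stores the aggregated JSON produced by the evaluation harness.
print(results[0])
```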
## Latest results
These are the [latest results from run 2024-04-11T06:34:56.974523](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__LimmyAutomerge-7B-slerp/blob/main/results_2024-04-11T06-34-56.974523.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6530195519488943,
"acc_stderr": 0.032049777506769586,
"acc_norm": 0.6522366684594754,
"acc_norm_stderr": 0.032722255479873584,
"mc1": 0.620563035495716,
"mc1_stderr": 0.01698703926614297,
"mc2": 0.77280388116297,
"mc2_stderr": 0.013838767457894557
},
"harness|arc:challenge|25": {
"acc": 0.7073378839590444,
"acc_stderr": 0.013295916103619423,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.714299940250946,
"acc_stderr": 0.004508239594503832,
"acc_norm": 0.8904600677155945,
"acc_norm_stderr": 0.003116771577319422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.01275285834653313,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.01275285834653313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.620563035495716,
"mc1_stderr": 0.01698703926614297,
"mc2": 0.77280388116297,
"mc2_stderr": 0.013838767457894557
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433537
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589527
}
}
```
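The per-task entries above can be post-processed directly; for instance, an overall MMLU figure can be obtained by averaging the accuracies of the `harness|hendrycksTest-*` tasks. A minimal sketch, assuming the JSON shown above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Load the results dict displayed above (assumed to be saved as results.json).
with open("results.json") as f:
    results = json.load(f)

# Collect the accuracy of every hendrycksTest (MMLU) sub-task and average them.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```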
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
saahith/EMSAssist-2 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcript
dtype: string
- name: duration
dtype: float64
splits:
- name: train
num_bytes: 617788659.262
num_examples: 1122
- name: test
num_bytes: 1197091986.0
num_examples: 600
download_size: 1350447521
dataset_size: 1814880645.262
---
# Dataset Card for "EMSAssist-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ani24/linkedinjobprompt | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_JCX-kcuf__Mistral-7B-v0.1-gpt-4-80k | ---
pretty_name: Evaluation run of JCX-kcuf/Mistral-7B-v0.1-gpt-4-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JCX-kcuf/Mistral-7B-v0.1-gpt-4-80k](https://huggingface.co/JCX-kcuf/Mistral-7B-v0.1-gpt-4-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JCX-kcuf__Mistral-7B-v0.1-gpt-4-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T07:48:42.420653](https://huggingface.co/datasets/open-llm-leaderboard/details_JCX-kcuf__Mistral-7B-v0.1-gpt-4-80k/blob/main/results_2024-03-11T07-48-42.420653.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6285216391863163,\n\
\ \"acc_stderr\": 0.032565582440025206,\n \"acc_norm\": 0.6350673512110009,\n\
\ \"acc_norm_stderr\": 0.033216187309971744,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5460109817921305,\n\
\ \"mc2_stderr\": 0.01539076861170272\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.014346869060229318,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6111332403903604,\n\
\ \"acc_stderr\": 0.004864966792310701,\n \"acc_norm\": 0.8104959171479785,\n\
\ \"acc_norm_stderr\": 0.003911075662883271\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094757,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094757\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295827,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295827\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n\
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560406,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560406\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001501,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001501\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n\
\ \"acc_stderr\": 0.012663412101248332,\n \"acc_norm\": 0.43546284224250326,\n\
\ \"acc_norm_stderr\": 0.012663412101248332\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675602,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675602\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5460109817921305,\n\
\ \"mc2_stderr\": 0.01539076861170272\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552669\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36315390447308565,\n \
\ \"acc_stderr\": 0.013246614539839862\n }\n}\n```"
repo_url: https://huggingface.co/JCX-kcuf/Mistral-7B-v0.1-gpt-4-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|arc:challenge|25_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|gsm8k|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hellaswag|10_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T07-48-42.420653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T07-48-42.420653.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- '**/details_harness|winogrande|5_2024-03-11T07-48-42.420653.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T07-48-42.420653.parquet'
- config_name: results
data_files:
- split: 2024_03_11T07_48_42.420653
path:
- results_2024-03-11T07-48-42.420653.parquet
- split: latest
path:
- results_2024-03-11T07-48-42.420653.parquet
---
# Dataset Card for Evaluation run of JCX-kcuf/Mistral-7B-v0.1-gpt-4-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JCX-kcuf/Mistral-7B-v0.1-gpt-4-80k](https://huggingface.co/JCX-kcuf/Mistral-7B-v0.1-gpt-4-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JCX-kcuf__Mistral-7B-v0.1-gpt-4-80k",
"harness_winogrande_5",
split="train")
```
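The same pattern works for any configuration listed in the YAML header above. For instance, a minimal sketch (using the `results` configuration and the `latest` split defined in that header) that pulls the aggregated metrics could look like this:
```python
from datasets import load_dataset

# The "latest" split of the "results" configuration always mirrors the newest run
results = load_dataset(
    "open-llm-leaderboard/details_JCX-kcuf__Mistral-7B-v0.1-gpt-4-80k",
    "results",
    split="latest",
)

# One row per run; print the stored aggregate metrics
print(results[0])
```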
## Latest results
These are the [latest results from run 2024-03-11T07:48:42.420653](https://huggingface.co/datasets/open-llm-leaderboard/details_JCX-kcuf__Mistral-7B-v0.1-gpt-4-80k/blob/main/results_2024-03-11T07-48-42.420653.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6285216391863163,
"acc_stderr": 0.032565582440025206,
"acc_norm": 0.6350673512110009,
"acc_norm_stderr": 0.033216187309971744,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5460109817921305,
"mc2_stderr": 0.01539076861170272
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.014346869060229318,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6111332403903604,
"acc_stderr": 0.004864966792310701,
"acc_norm": 0.8104959171479785,
"acc_norm_stderr": 0.003911075662883271
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094757,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094757
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295827,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295827
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560406,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560406
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001501,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001501
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43546284224250326,
"acc_stderr": 0.012663412101248332,
"acc_norm": 0.43546284224250326,
"acc_norm_stderr": 0.012663412101248332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675602,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675602
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5460109817921305,
"mc2_stderr": 0.01539076861170272
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552669
},
"harness|gsm8k|5": {
"acc": 0.36315390447308565,
"acc_stderr": 0.013246614539839862
}
}
```
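To inspect the per-sample predictions behind any score above, load the matching detail configuration from the YAML header. As a sketch (the configuration name below is taken from that header; the exact columns depend on the harness version), the MMLU world-religions subtask can be pulled like this:
```python
from datasets import load_dataset

# Per-sample details for a single MMLU subtask; "latest" mirrors the newest run
details = load_dataset(
    "open-llm-leaderboard/details_JCX-kcuf__Mistral-7B-v0.1-gpt-4-80k",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)

print(len(details))          # number of evaluated examples
print(details.column_names)  # prompt/prediction/metric columns (harness-dependent)
```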
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
EgilKarlsen/AA_RoBERTa_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318765
num_examples: 26057
- name: test
num_bytes: 26774056
num_examples: 8686
download_size: 147154828
dataset_size: 107092821
---
# Dataset Card for "AA_RoBERTa_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
prognosis/cardio-chunks-tokenid | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 23794644
num_examples: 1
download_size: 10557791
dataset_size: 23794644
---
# Dataset Card for "cardio-chunks-tokenid"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Code-Hugger/airfoil-2dsteady | ---
license: apache-2.0
---
|
michaelb1225/open-cm | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 6129427551.0
num_examples: 671
download_size: 6071742068
dataset_size: 6129427551.0
---
# Dataset Card for "open-cm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crumb/KAP4ICL-C4-UL2-15k | ---
dataset_info:
features:
- name: combined_facts_text
dtype: string
- name: raw_text
dtype: string
- name: raw_facts
sequence: string
- name: raw_fact_prompts
sequence: string
- name: raw_topics
sequence: string
- name: raw_topic_prompts
sequence: string
- name: len_text
dtype: int64
- name: num_identifications
dtype: int64
- name: base_topic_count
dtype: int64
- name: len_raw_text
dtype: int64
- name: len_raw_facts
dtype: int64
splits:
- name: train
num_bytes: 82413380
num_examples: 15000
download_size: 47256239
dataset_size: 82413380
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "KAP4ICL-C4-UL2-15k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tfshaman/wrong_metamath_sympy_v1 | ---
dataset_info:
features:
- name: output
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: code_output
dtype: float64
- name: data_type
dtype: string
splits:
- name: train
num_bytes: 98284156
num_examples: 39263
download_size: 34817356
dataset_size: 98284156
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wrong_metamath_sympy_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manishiitg/LDJnr-Capybara | ---
dataset_info:
features:
- name: org_dataset
dtype: string
- name: uniq_id
dtype: string
- name: en_messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: hi_messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 76326810
num_examples: 6710
download_size: 30685522
dataset_size: 76326810
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Spanish_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Spanish_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/951?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The data volume is 435 hours, recorded by 989 native Spanish speakers. The recording text was designed by linguistic experts and covers general interactive, in-car and home categories. The texts are manually proofread with high accuracy. Recording devices are mainstream Android phones and iPhones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/951?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
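As an illustrative sketch only (the file paths, transcripts, and 16 kHz sampling rate below are assumptions, not part of the delivered package), locally delivered audio/transcript pairs could be wrapped into a Hugging Face Datasets object for ASR fine-tuning like this:
```python
from datasets import Audio, Dataset

# Hypothetical local paths and transcripts; adapt to the actual delivery format
examples = {
    "audio": ["data/es_0001.wav", "data/es_0002.wav"],
    "sentence": ["hola, ¿cómo estás?", "enciende la radio del coche"],
}

# The 16 kHz sampling rate is an assumption; audio decoding happens lazily on access
ds = Dataset.from_dict(examples).cast_column("audio", Audio(sampling_rate=16_000))
print(ds.features)
```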
### Languages
Spanish
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
sethapun/arithmetic_2md_1to100 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 57712
num_examples: 2000
- name: validation
num_bytes: 11550
num_examples: 400
download_size: 29072
dataset_size: 69262
---
# Dataset Card for "arithmetic_2md_1to100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
8osm3rka/azure-docs | ---
license: cc-by-4.0
---
|
ashoksu30/My_test | ---
license: c-uda
---
|
open-llm-leaderboard/details_chavinlo__gpt4-x-alpaca | ---
pretty_name: Evaluation run of chavinlo/gpt4-x-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chavinlo/gpt4-x-alpaca](https://huggingface.co/chavinlo/gpt4-x-alpaca) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chavinlo__gpt4-x-alpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T20:56:09.987040](https://huggingface.co/datasets/open-llm-leaderboard/details_chavinlo__gpt4-x-alpaca/blob/main/results_2023-09-22T20-56-09.987040.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15478187919463088,\n\
\ \"em_stderr\": 0.003704111989193061,\n \"f1\": 0.24988045302013467,\n\
\ \"f1_stderr\": 0.00385619985047934,\n \"acc\": 0.3648545063856345,\n\
\ \"acc_stderr\": 0.008703557271933391\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.15478187919463088,\n \"em_stderr\": 0.003704111989193061,\n\
\ \"f1\": 0.24988045302013467,\n \"f1_stderr\": 0.00385619985047934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \
\ \"acc_stderr\": 0.004548229533836362\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7016574585635359,\n \"acc_stderr\": 0.012858885010030421\n\
\ }\n}\n```"
repo_url: https://huggingface.co/chavinlo/gpt4-x-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T20_56_09.987040
path:
- '**/details_harness|drop|3_2023-09-22T20-56-09.987040.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T20-56-09.987040.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T20_56_09.987040
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-56-09.987040.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-56-09.987040.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T20_56_09.987040
path:
- '**/details_harness|winogrande|5_2023-09-22T20-56-09.987040.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T20-56-09.987040.parquet'
- config_name: results
data_files:
- split: 2023_09_22T20_56_09.987040
path:
- results_2023-09-22T20-56-09.987040.parquet
- split: latest
path:
- results_2023-09-22T20-56-09.987040.parquet
---
# Dataset Card for Evaluation run of chavinlo/gpt4-x-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chavinlo/gpt4-x-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chavinlo/gpt4-x-alpaca](https://huggingface.co/chavinlo/gpt4-x-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chavinlo__gpt4-x-alpaca",
"harness_winogrande_5",
split="train")
```
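Continuing from the snippet above, the loaded split can be converted to a pandas DataFrame for quick inspection (the exact columns depend on the harness version):
```python
import pandas as pd

# Convert the details split loaded above to a DataFrame
df = data.to_pandas()
pd.set_option("display.max_colwidth", 120)
print(df.shape)
print(df.head())
```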
## Latest results
These are the [latest results from run 2023-09-22T20:56:09.987040](https://huggingface.co/datasets/open-llm-leaderboard/details_chavinlo__gpt4-x-alpaca/blob/main/results_2023-09-22T20-56-09.987040.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.15478187919463088,
"em_stderr": 0.003704111989193061,
"f1": 0.24988045302013467,
"f1_stderr": 0.00385619985047934,
"acc": 0.3648545063856345,
"acc_stderr": 0.008703557271933391
},
"harness|drop|3": {
"em": 0.15478187919463088,
"em_stderr": 0.003704111989193061,
"f1": 0.24988045302013467,
"f1_stderr": 0.00385619985047934
},
"harness|gsm8k|5": {
"acc": 0.028051554207733132,
"acc_stderr": 0.004548229533836362
},
"harness|winogrande|5": {
"acc": 0.7016574585635359,
"acc_stderr": 0.012858885010030421
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bvkbharadwaj/Atharv-ved-kand4 | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 151870213.0
num_examples: 1
download_size: 114502560
dataset_size: 151870213.0
---
# Dataset Card for "Atharv-ved-kand4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quirky-lats-at-mats/NORMAL_BACKDOOR_alpaca_sleeper_agents_toy_safety_SFT_v4 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1282425
num_examples: 2828
download_size: 681489
dataset_size: 1282425
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_first_sent_train_100_eval_20_baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 169972
num_examples: 100
- name: validation
num_bytes: 35584
num_examples: 20
download_size: 158682
dataset_size: 205556
---
# Dataset Card for "find_first_sent_train_100_eval_20_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/betty_neuralcloud | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of betty/ベティ/贝蒂 (Neural Cloud)
This is the dataset of betty/ベティ/贝蒂 (Neural Cloud), containing 132 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, cat_ears, blue_eyes, long_hair, twintails, hair_ornament, bangs, hair_between_eyes, tail, fang, hairclip, cat_tail, breasts, animal_ear_fluff`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 132 | 148.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betty_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 132 | 85.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betty_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 310 | 184.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betty_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 132 | 131.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betty_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 310 | 261.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betty_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/betty_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, fingerless_gloves, headset, collared_shirt, holding_gun, open_mouth, suspender_shorts, white_shirt, black_shorts, sleeves_rolled_up, solo, submachine_gun, looking_at_viewer, short_shorts, knee_pads, striped_necktie, :d, blue_panties, blush, boots, character_name, cowboy_shot |
| 1 | 7 |  |  |  |  |  | 1girl, solo, fingerless_gloves, headset, looking_at_viewer, shorts, boots, knee_pads, necktie, suspenders, open_mouth, panties, :3, full_body, green_gloves, holding_gun, simple_background, sleeves_rolled_up, smile, submachine_gun, white_shirt |
| 2 | 6 |  |  |  |  |  | 1girl, collared_shirt, headset, necktie, solo, white_shirt, looking_at_viewer, simple_background, white_background, suspenders, upper_body, :3, closed_mouth |
| 3 | 11 |  |  |  |  |  | 1girl, elbow_gloves, solo, black_dress, black_gloves, official_alternate_costume, open_mouth, smile, bare_shoulders, choker, small_breasts, collarbone, looking_at_viewer, strapless_dress, sunglasses, simple_background, thigh_strap, :3, gun, tail_ribbon, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | fingerless_gloves | headset | collared_shirt | holding_gun | open_mouth | suspender_shorts | white_shirt | black_shorts | sleeves_rolled_up | solo | submachine_gun | looking_at_viewer | short_shorts | knee_pads | striped_necktie | :d | blue_panties | blush | boots | character_name | cowboy_shot | shorts | necktie | suspenders | panties | :3 | full_body | green_gloves | simple_background | smile | white_background | upper_body | closed_mouth | elbow_gloves | black_dress | black_gloves | official_alternate_costume | bare_shoulders | choker | small_breasts | collarbone | strapless_dress | sunglasses | thigh_strap | gun | tail_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------|:-----------------|:--------------|:-------------|:-------------------|:--------------|:---------------|:--------------------|:-------|:-----------------|:--------------------|:---------------|:------------|:------------------|:-----|:---------------|:--------|:--------|:-----------------|:--------------|:---------|:----------|:-------------|:----------|:-----|:------------|:---------------|:--------------------|:--------|:-------------------|:-------------|:---------------|:---------------|:--------------|:---------------|:-----------------------------|:-----------------|:---------|:----------------|:-------------|:------------------|:-------------|:--------------|:------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | X | | X | | X | X | X | X | | X | | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | | | | X | | | X | | X | | | | | | | | | | | X | X | | X | | | X | | X | X | X | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | | | X | | | | | X | | X | | | | | | | | | | | | | | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
rashmi035/dataset_audio_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
- name: set
dtype: string
splits:
- name: train
num_bytes: 1339597.0
num_examples: 5
- name: validation
num_bytes: 1304849.0
num_examples: 5
- name: test
num_bytes: 1499545.0
num_examples: 5
download_size: 3939356
dataset_size: 4143991.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "dataset_audio_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sarthakpadhi2016/code-llama-spider-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1902503
num_examples: 1000
download_size: 514871
dataset_size: 1902503
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code-llama-spider-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Graphcore/vqa-lxmert | ---
language:
- en
license:
- cc-by-4.0
---
|
yezhengli9/wmt20-ja-en | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 470675
num_examples: 993
download_size: 238951
dataset_size: 470675
---
# Dataset Card for "wmt20-ja-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-40000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 661336
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mstz/ionosphere | ---
language:
- en
tags:
- ionosphere
- tabular_classification
- binary_classification
- UCI
pretty_name: Ionosphere
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- ionosphere
license: cc
---
# Ionosphere
The [Ionosphere dataset](https://archive.ics.uci.edu/ml/datasets/Ionosphere) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Radar dataset of signals returned from the ionosphere; the task is to predict whether a received signal indicates the presence of free electrons in the ionosphere.
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|---------------------------------------------------------------|
| ionosphere | Binary classification | Does the received signal indicate electrons in the ionosphere?|
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/ionosphere")["train"]
``` |
alexwww94/SimCLUE | ---
license: other
---
|
Superar/Puntuguese | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
- token-classification
language:
- pt
pretty_name: Puntuguese - A Corpus of Puns in Portuguese with Micro-editions
tags:
- humor
- puns
- humor-recognition
- pun-location
---
# Puntuguese - A Corpus of Puns in Portuguese with Micro-editions
Puntuguese is a corpus of Portuguese punning texts, including Brazilian and European Portuguese jokes. The data has been manually gathered and curated according to our [guidelines](https://github.com/Superar/Puntuguese/blob/main/data/GUIDELINES.md). It also contains some layers of annotation:
- Every pun is classified as homophonic, homographic, both, or none according to their specific punning signs;
- The punning and alternative signs were made explicit for every joke;
- We also mark potentially problematic puns from an ethical perspective, so it is easier to filter them out if needed.
Additionally, every joke in the corpus has a non-humorous counterpart, obtained via micro-editing, to enable Machine Learning systems to be trained.
### Dataset Description
- **Curated by:** [Marcio Lima Inácio](https://eden.dei.uc.pt/~mlinacio/)
- **Funded by:** FCT - Foundation for Science and Technology, I.P. (grant number UI/BD/153496/2022) and the Portuguese Recovery and Resilience Plan (project C645008882-00000055, Center for Responsible AI).
- **Languages:** Brazilian Portuguese; European Portuguese
- **License:** CC-BY-SA-4.0
### Dataset Sources
The puns were collected from three sources: the "Maiores e melhores" web blog, the "O Sagrado Caderno das Piadas Secas" Instagram page, and from the "UTC - Ultimate Trocadilho Challenge" by Castro Brothers on Youtube.
- **Repository:** https://github.com/Superar/Puntuguese
- **Paper:** To be announced
## Dataset Structure
The dataset provided via Hugging Face Hub contains two tasks: humor recognition and pun location. The first task uses the `text` and `label` columns. For pun location, the columns to be used are `tokens` and `labels`. An instance example can be seen below:
```json
{
"id": "1.1.H",
"text": "Deve ser difícil ser professor de natação. Você ensina, ensina, e o aluno nada.",
"label": 1,
"tokens": ["Deve", "ser", "difícil", "ser", "professor", "de", "natação", ".", "Você", "ensina", ",", "ensina", ",", "e", "o", "aluno", "nada", "."],
"labels": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0]
}
```
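As a minimal sketch (assuming the default configuration loads directly with `datasets`; split names are not verified here), the two tasks can be prepared from these fields as follows:
```python
from datasets import load_dataset

# Load Puntuguese from the Hub (default configuration assumed).
dataset = load_dataset("Superar/Puntuguese")

# Humor recognition: classification over the `text` / `label` columns.
recognition = dataset.select_columns(["text", "label"])

# Pun location: token classification over the `tokens` / `labels` columns.
location = dataset.select_columns(["tokens", "labels"])
```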
## Dataset Creation
#### Data Collection and Processing
The data was manually gathered and curated to ensure that all jokes followed our chosen definition of pun by Miller et al. (2017):
> "A pun is a form of wordplay in which one sign (e.g., a word or phrase) suggests two or more meanings by exploiting polysemy, homonymy, or phonological similarity to another sign, for an intended humorous or rhetorical effect."
Every selected pun must satisfy this definition. Gatherers were also provided some hints for this process:
- A sign can be a single word (or token), a phrase (a sequence of tokens), or a part of a word (a subtoken);
- The humorous effect must rely on the ambiguity of said sign;
- The ambiguity must originate from the word's form (written or spoken);
- Every pun must have a "pun word" (the ambiguous sign that is in the text) and an "alternative word" (the sign's ambiguous interpretation) identified. If it is not possible to identify both, the text is not considered a pun and should not be included.
#### Who are the source data producers?
The original data was produced by professional comedians from the mentioned sources.
## Bias, Risks, and Limitations
As in every real-life scenario, the data can contain problematic and insensitive jokes about delicate subjects. For this reason, we provide in our GitHub repository a list of jokes that the gatherers personally considered problematic.
## Citation
**BibTeX:**
```
@inproceedings{InacioEtAl2024,
title = {Puntuguese: A Corpus of Puns in {{P}}ortuguese with Micro-editions},
author = {In{\'a}cio, Marcio Lima and {Wick-pedro}, Gabriela and Ramisch, Renata and Esp{\'i}rito Santo, Lu{\'i}s and Chacon, Xiomara S. Q. and Santos, Roney and Sousa, Rog{\'e}rio and Anchi{\^e}ta, Rafael and Gon{\c c}alo Oliveira, Hugo},
year = {2024},
note = {Accepted to LREC-COLING 2024}
}
```
**APA:**
```
Inácio, M. L., Wick-Pedro, G., Ramisch, R., Epírito Santo, L., Chacon, X. S. Q., Santos, R., Sousa, R., Anchiêta, R. & Gonçalo Oliveira, H. (2024). Puntuguese: A Corpus of Puns in {{P}}ortuguese with Micro-editions. Accepted to LREC-COLING 2024.
``` |
soteroshanthi/courses-dataset | ---
license: apache-2.0
---
|
gayanin/kaggle-native-v8 | ---
dataset_info:
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 551013
num_examples: 5140
- name: test
num_bytes: 68382
num_examples: 643
- name: validation
num_bytes: 69979
num_examples: 643
download_size: 270595
dataset_size: 689374
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
tyzhu/lmind_nq_train5000_eval5000_v1_docidx | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 581636
num_examples: 5000
- name: train_recite_qa
num_bytes: 3790343
num_examples: 5000
- name: eval_qa
num_bytes: 580393
num_examples: 5000
- name: eval_recite_qa
num_bytes: 3785337
num_examples: 5000
- name: all_docs
num_bytes: 5846467
num_examples: 8964
- name: all_docs_eval
num_bytes: 5845967
num_examples: 8964
- name: train
num_bytes: 5846467
num_examples: 8964
- name: validation
num_bytes: 5845967
num_examples: 8964
download_size: 20139574
dataset_size: 32122577
---
# Dataset Card for "lmind_nq_train5000_eval5000_v1_docidx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_proximal_distal_demonstratives | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 16915
num_examples: 105
- name: test
num_bytes: 32941
num_examples: 211
- name: train
num_bytes: 431221
num_examples: 3604
download_size: 254519
dataset_size: 481077
---
# Dataset Card for "MULTI_VALUE_sst2_proximal_distal_demonstratives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zellic/smart-contract-fiesta | ---
language:
- en
tags:
- solidity
- blockchain
- ethereum
- smart-contract
pretty_name: Zellic Smart Contract Source Index
size_categories:
- 100K<n<1M
---
# Zellic 2023 Smart Contract Source Index
Zellic is making publicly available a dataset of known Ethereum mainnet smart contract source code.
Our aim is to provide a contract source code dataset that is readily available to the public to download in bulk. We believe this dataset will help advance the frontier of smart contract security research. Applications include static analysis, machine learning, and more. This effort is part of Zellic’s mission to create a world with no smart contract hacks.
## Methodology
First, we accumulated a list of all deployed contracts on Ethereum mainnet as of block 16860349. This does not include contracts that have been `SELFDESTRUCT`ed. We progressively built up this index by performing a full sync from the genesis block using a modified Geth instance. Whenever a new contract was created, we added it to our index. When a contract `SELFDESTRUCT`ed, we removed it from the index. This list is available in this dataset as the file `address_bytecodehash_index`.
Next, we collected contract source code from publicly available online sources. All data was obtained from publicly accessible resources.
Finally, we calculated the Keccak256 hash of the deployed runtime EVM bytecode of each contract. We deduplicated contract source code by bytecode hash. In other words, we organized the contract source code set by the bytecode hash of the corresponding verified contracts. For example, if source codes A and B are both verified against smart contracts X and Y with the same deployed EVM bytecode, we only include one of A or B in this dataset. The choice among duplicates was arbitrary.
## Dataset Statistics
**Number of unique source codes, by bytecode hash**: 149,386
**Contracts with code available**: 3,897,319 (This is more than the previous number, because MANY contracts share identical bytecode)
**Number of smart contracts in global index**: 30,586,657 (not all have source code available, see Methodology)
| **Chars (wc -c)** | **Words (wc -w)** | **LoC (code)** | **LoC (comments)** | **LoC (whitespace)** | **LoC (total)** |
|-------------------|-------------------|----------------|--------------------|----------------------|-----------------|
| 6,473,548,073 | 712,444,206 | 90,562,628 | 62,503,873 | 24,485,549 | 177,552,050 |
**Unique words**: 939,288
## Dataset Structure
### Index
The `address_bytecodehash_index` file contains a list of known smart contract addresses mapped to the Keccak256 hash of their EVM bytecode.
Look up the smart contract address in this file to find the source. This file also serves as a list of all deployed smart contracts as of block 16860349.
**Not all contracts in the index file will have source code available.** This is a list of **all** deployed smart contracts as of block 16860349. (See Methodology).
Excerpt of data from the index for preview purposes:
```
...
00012e87fa9172d0c613f69d0abf752bb00310ec:4f5a5f6706dc853cb3ae2279729e0d7e24dda128a77358144e4c0fd3e5d60e98
00012c8ef0fef0a06e1644ab91107fe8584fb91e:a828ef7f5f6d2ebb1203de12878e16aa5ba6984c12ededff4e19876233533505
00012df38ea3a6dabefb8407a59219a0c7dd0bc8:c279544d07d9631b1e37d835cadfe7098d60e508cf8f18a89ddb8b176d56874d
00012d92a0e7ee1b19f8e018267c97a3a7e99aa7:0865cec1e9ac3048b12a85fc3b9fbc682c3831784e3396416635df4cb88c3fdd
00012f07e281c1d8a9d790358050b6015eef942c:ab7af4c77ed6371c7eda04ba317a134f0b06593c0dc2851bf4c709a367ea50ed
00012e198745e53293bf09ddec8da1284963fded:ce33220d5c7f0d09d75ceff76c05863c5e7d6e801c70dfe7d5d45d4c44e80654
00012ec2c9fc4a1692176da5202a44a4aea5e177:ce33220d5c7f0d09d75ceff76c05863c5e7d6e801c70dfe7d5d45d4c44e80654
...
```
### Contract Sources
Smart Contract sources are organized by folder in the `organized_contracts` directory.
For example, a contract with the bytecode hash `beef3d7d1884c4fee50548cfe762415fe494e3feb1e6ca181352ef023ba1ff7a` would be in the directory `organized_contracts/be/beef3d7d1884c4fee50548cfe762415fe494e3feb1e6ca181352ef023ba1ff7a/`.
Each folder for a smart contract contains the source files as well as a `metadata.json` that contains information about the contract such as the compiler version and optimizations used. These settings can be used to attempt to reproduce the build.
Example of metadata.json for preview purposes (unminified for ease of viewing):
```json
{
"ContractName": "MageSpace",
"CompilerVersion": "v0.8.10+commit.fc410830",
"Runs": 200,
"OptimizationUsed": false,
"BytecodeHash": "c2f8f4e79a9d7c23d8a398768e1476f03f0e11c44fc7441c021e098c71678d03"
}
```
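As a rough sketch of how to go from a contract address to its source directory (assuming the index format `<address>:<bytecode hash>` and the two-character directory prefix shown above):
```python
import os

def bytecode_hash_for(address, index_path="address_bytecodehash_index"):
    # Addresses in the index are lowercase hex without the "0x" prefix.
    address = address.lower()
    if address.startswith("0x"):
        address = address[2:]
    with open(index_path) as f:
        for line in f:
            addr, _, code_hash = line.strip().partition(":")
            if addr == address:
                return code_hash
    return None

code_hash = bytecode_hash_for("0x00012e87fa9172d0c613f69d0abf752bb00310ec")
if code_hash is not None:
    source_dir = os.path.join("organized_contracts", code_hash[:2], code_hash)
    # The directory holds the source files and metadata.json, if the contract was verified.
    print(source_dir)
```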
#### Source Formats
Contracts may come in one of three source formats: single file, multiple files, and [Solidity Compiler JSON](https://docs.soliditylang.org/en/v0.8.19/using-the-compiler.html#compiler-api).
For multiple file contracts, each `.sol` file will be included in the directory.
Single file contracts will be named `main.sol`. Some contracts are written in Vyper, not Solidity. These will be named `main.vy`.
For Solidity Compiler Input JSON, the compiler input will be stored in `contract.json`.
**Not all contract code is in Solidity. Some contract code is in Vyper, or other languages! Check metadata.json!**
As a quick-and-dirty script, to extract all of the source code, you can use this bash script:
```bash
mkdir code
cd organized_contracts/
for f in * ; do
    echo "$f"
    # Solidity Compiler Input JSON contracts
    cat "$f"/*/contract.json 2>/dev/null | jq '.sources | to_entries[].value.content' -r > ../code/"$f".txt
    # single- and multi-file Solidity sources (appended so they do not clobber the jq output)
    cat "$f"/*/*.sol >> ../code/"$f".txt 2>/dev/null
done
```
### Other Fun Facts
Top 100 words:
<details>
<summary>Click to expand</summary>
<pre>
23189252 the
20816285 address
16207663 uint256
14793579 to
13746030 function
9952507 returns
9069124 0
8256548 a
8189582 of
6854095 is
6783298 dev
6363279 return
5555811 if
5497552 memory
5403232 from
5203839 amount
5146685 internal
4838549 value
4753195 be
4700814 external
4676440 owner
4535518 this
4477899 view
4463166 for
4205382 bool
3770805 contract
3732595 token
3719841 and
3578693 public
3447968 string
3422923 tokenid
3243596 require
3134425 1
3063929 in
2996585 bytes
2976900 data
2831472 by
2748878 transfer
2729742 account
2605117 that
2588692 param
2535414 private
2465042 an
2418190 solidity
2377723 uint
2333621 call
2326567 not
2319841 virtual
2295154 zero
2220201 sender
2118342 as
2113922 sol
2024428 target
1945888 event
1919425 s
1901005 or
1899022 pure
1884128 tokens
1859283 must
1850785 it
1796854 with
1783457 contracts
1760318 b
1742610 revert
1711696 spender
1698735 bytes32
1655261 recipient
1645305 i
1608529 indexed
1585283 true
1575421 2
1551352 when
1528254 can
1475879 length
1466789 override
1444666 will
1356364 approve
1355666 8
1314732 notice
1304351 implementation
1293963 are
1291253 import
1290551 on
1267019 balance
1257438 available
1253286 log
1232433 pragma
1211177 since
1193506 msgsender
1193496 result
1190481 liquidity
1185869 msg
1181724 operator
1178211 errormessage
1176497 slot
1156971 set
1154460 openzeppelin
1148764 cannot
1123141 erc20
1115019 abi
</pre>
</details>
## Notices
The smart contract source code in this dataset was obtained from publicly available sources. You should always abide by the appropriate code and software licenses, as well as all applicable copyright law.
THE DATASET/SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE DATASET/SOFTWARE OR THE USE OR OTHER DEALINGS IN THE DATASET/SOFTWARE.
|
anytxt/test | ---
license: other
---
|
BTBurke/2c-short | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 795453.4954577219
num_examples: 1216
- name: test
num_bytes: 140643.50454227813
num_examples: 215
download_size: 359185
dataset_size: 936097.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Multimodal-Fatima/LLM_Description_Vocab_opt_facebook_opt_30b_downstream_tasks | ---
dataset_info:
features:
- name: vocab
dtype: string
- name: descriptions
sequence: string
splits:
- name: test
num_bytes: 528559
num_examples: 3426
download_size: 157247
dataset_size: 528559
---
# Dataset Card for "LLM_Description_Vocab_opt_facebook_opt_30b_downstream_tasks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
frollo/ItalianCrimeNews | ---
license: mit
---
The dataset contains the main components of the news articles published online by the newspaper named <a href="https://gazzettadimodena.gelocal.it/modena">Gazzetta di Modena</a>: url of the web page, title, sub-title, text, date of publication, crime category assigned to each news article by the author.
The news articles are written in Italian and describe 11 types of crime events occurred in the province of Modena between the end of 2011 and 2021.
Moreover, the dataset includes data derived from the above components through the application of Natural Language Processing techniques. Examples are the place of the crime event (municipality, area, address and GPS coordinates), the date of the occurrence, and the type of crime event described in the news article, obtained by automatic categorization of the text.
Finally, news articles describing the same crime events (duplicates) are detected by calculating document similarity.
We are now working on applying question answering to extract the 5W+1H, and we plan to extend the current dataset with the obtained data.
Other researchers can employ the dataset to apply other algorithms of text categorization and duplicate detection and compare their results with the benchmark. The dataset can be useful for several scopes, e.g., geo-localization of the events, text summarization, crime analysis, crime prediction, community detection, topic modeling.
|
yuelaiyu/Hanazawa_Kana | ---
license: openrail
---
|
lmqg/qg_esquad | ---
license: cc-by-4.0
pretty_name: SQuAD-es for question generation
language: es
multilinguality: monolingual
size_categories: 10K<n<100K
source_datasets: squad_es
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qg_esquad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a subset of [QG-Bench](https://github.com/asahi417/lm-question-generation/blob/master/QG_BENCH.md#datasets), a unified question generation benchmark proposed in
["Generative Language Models for Paragraph-Level Question Generation: A Unified Benchmark and Evaluation, EMNLP 2022 main conference"](https://arxiv.org/abs/2210.03992).
This is a modified version of [SQuAD-es](https://huggingface.co/datasets/squad_es) for the question generation (QG) task.
Since the original dataset only contains training and validation sets, we manually sampled a test set from the training set, with no paragraph overlap with the remaining training data.
### Supported Tasks and Leaderboards
* `question-generation`: The dataset is assumed to be used to train a model for question generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore (see our paper for more detail).
### Languages
Spanish (es)
## Dataset Structure
An example of 'train' looks as follows.
```
{
'answer': 'comedia musical',
'question': '¿Qué género de película protagonizó Beyonce con Cuba Gooding, Jr?',
'sentence': 'en la comedia musical ',
'paragraph': 'En julio de 2002, Beyoncé continuó su carrera como actriz interpretando a Foxxy Cleopatra junto a Mike Myers en la película de comedia, Austin Powers in Goldmember, que pasó su primer fin de semana en la cima de la taquilla de Estados Unidos. Beyoncé lanzó "Work It Out" como el primer sencillo de su álbum de banda sonora que entró en el top ten en el Reino Unido, Noruega y Bélgica. En 2003, Knowles protagonizó junto a Cuba Gooding, Jr., en la comedia musical The Fighting Temptations como Lilly, una madre soltera de quien el personaje de Gooding se enamora. Beyoncé lanzó "Fighting Temptation" como el primer sencillo de la banda sonora de la película, con Missy Elliott, MC Lyte y Free que también se utilizó para promocionar la película. Otra de las contribuciones de Beyoncé a la banda sonora, "Summertime", fue mejor en las listas de Estados Unidos.',
'sentence_answer': 'en la <hl> comedia musical <hl> ',
'paragraph_answer': 'En julio de 2002, Beyoncé continuó su carrera como actriz interpretando a Foxxy Cleopatra junto a Mike Myers en la película de comedia, Austin Powers in Goldmember, que pasó su primer fin de semana en la cima de la taquilla de Estados Unidos. Beyoncé lanzó "Work It Out" como el primer sencillo de su álbum de banda sonora que entró en el top ten en el Reino Unido, Noruega y Bélgica. En 2003, Knowles protagonizó junto a Cuba Gooding, Jr., en la <hl> comedia musical <hl> The Fighting Temptations como Lilly, una madre soltera de quien el personaje de Gooding se enamora. Beyoncé lanzó "Fighting Temptation" como el primer sencillo de la banda sonora de la película, con Missy Elliott, MC Lyte y Free que también se utilizó para promocionar la película. Otra de las contribuciones de Beyoncé a la banda sonora, "Summertime", fue mejor en las listas de Estados Unidos.',
'paragraph_sentence': 'En julio de 2002, Beyoncé continuó su carrera como actriz interpretando a Foxxy Cleopatra junto a Mike Myers en la película de comedia, Austin Powers in Goldmember, que pasó su primer fin de semana en la cima de la taquilla de Estados Unidos. Beyoncé lanzó "Work It Out" como el primer sencillo de su álbum de banda sonora que entró en el top ten en el Reino Unido, Noruega y Bélgica. En 2003, Knowles protagonizó junto a Cuba Gooding, Jr. , <hl> en la comedia musical <hl> The Fighting Temptations como Lilly, una madre soltera de quien el personaje de Gooding se enamora. Beyoncé lanzó "Fighting Temptation" como el primer sencillo de la banda sonora de la película, con Missy Elliott, MC Lyte y Free que también se utilizó para promocionar la película. Otra de las contribuciones de Beyoncé a la banda sonora, "Summertime", fue mejor en las listas de Estados Unidos.',
}
```
The data fields are the same among all splits.
- `question`: a `string` feature.
- `paragraph`: a `string` feature.
- `answer`: a `string` feature.
- `sentence`: a `string` feature.
- `paragraph_answer`: a `string` feature, which is same as the paragraph but the answer is highlighted by a special token `<hl>`.
- `paragraph_sentence`: a `string` feature, which is same as the paragraph but a sentence containing the answer is highlighted by a special token `<hl>`.
- `sentence_answer`: a `string` feature, which is same as the sentence but the answer is highlighted by a special token `<hl>`.
Each of the `paragraph_answer`, `paragraph_sentence`, and `sentence_answer` features can be used to train a question generation model, but with different information. The `paragraph_answer` and `sentence_answer` features are for answer-aware question generation, and the `paragraph_sentence` feature is for sentence-aware question generation.
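For instance, the answer-aware pairs can be loaded as follows (a minimal sketch, assuming the dataset loads with its default configuration):
```python
from datasets import load_dataset

# Load the Spanish question generation data (default configuration assumed).
dataset = load_dataset("lmqg/qg_esquad", split="train")

example = dataset[0]
# Answer-aware question generation: paragraph with the answer highlighted by <hl> -> question.
print(example["paragraph_answer"])
print(example["question"])
```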
## Data Splits
|train|validation|test |
|----:|---------:|----:|
|77025| 10570 |10570|
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
Maxmioti/GDRP-fines | ---
license: other
---
Open-source dataset from a Kaggle competition: https://www.kaggle.com/datasets/andreibuliga1/gdpr-fines-20182020-updated-23012021
GDPR-fines is a dataset with summaries of GDPR cases from companies that were fined between 2018 and 2021. You will find the summary plus the Articles violated in each case (the 3 most important ones, plus an "Others" label regrouping the remaining articles).
Raw text and lemmatized text are available, plus multi-labels. |
MuhammadHelmy/nafsy-QA | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 125662
num_examples: 232
- name: test
num_bytes: 24774
num_examples: 44
download_size: 81984
dataset_size: 150436
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- question-answering
- text-generation
language:
- ar
tags:
- mental health
- psychology
size_categories:
- n<1K
---
# Dataset Card for nafsy-QA
<!-- Provide a quick summary of the dataset. -->
This is an Arabic QA dataset for mental health. It originates from [Nafsy.net](https://nafsy.net/) articles and blogs.
## Dataset Details
**Language(s) (NLP):** Arabic
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
- Supervised Fine-tuning (see the loading sketch below)
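A minimal loading sketch, using the repository ID, the `prompt`/`response` columns, and the train/test splits declared in the dataset info above:
```python
from datasets import load_dataset

# Load the Arabic mental-health QA pairs.
dataset = load_dataset("MuhammadHelmy/nafsy-QA")

def format_example(example):
    # Illustrative prompt/response concatenation for supervised fine-tuning.
    return {"text": f"{example['prompt']}\n{example['response']}"}

sft_train = dataset["train"].map(format_example)
print(sft_train[0]["text"])
```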
## Dataset Creation
- GPT-3.5-Turbo has been used to extract question and answer pairs from the original plain text.
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Creating an Arabic chatbot for mental health support.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
- This dataset was originally scraped from [Nafsy.net](https://nafsy.net/) and then uploaded to Kaggle.
- The QA extraction was made on the preprocessed data in my other repo [MuhammadHelmy/nafsy](https://huggingface.co/datasets/MuhammadHelmy/nafsy)
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[husamal](https://www.kaggle.com/husamal)
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{Husamal_2021, title={Arabic-physcology-dataset}, url={https://www.kaggle.com/datasets/husamal/arabicphyscologydataset?select=nafsy.csv}, journal={Kaggle}, author={Husamal}, year={2021}, month={May}}
## Dataset Card Authors
Muhammad Helmy
## Dataset Card Contact
muhammadhelmymmo@gmail.com |
eitanturok/API-Bench-TorchHub | ---
dataset_info:
- config_name: eval
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: domain
dtype: string
- name: api_call
dtype: string
- name: api_provider
dtype: string
- name: explanation
dtype: string
- name: code
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_text
dtype: string
splits:
- name: train
num_bytes: 210314
num_examples: 186
download_size: 40974
dataset_size: 210314
- config_name: train
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: domain
dtype: string
- name: api_call
dtype: string
- name: api_provider
dtype: string
- name: explanation
dtype: string
- name: code
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_text
dtype: string
splits:
- name: train
num_bytes: 939667
num_examples: 837
download_size: 145995
dataset_size: 939667
configs:
- config_name: eval
data_files:
- split: train
path: eval/train-*
- config_name: train
data_files:
- split: train
path: train/train-*
---
|
Nexdata/accented_english | ---
task_categories:
- automatic-speech-recognition
language:
- en
---
# Dataset Card for accented-english
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Homepage:** https://nexdata.ai/?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset contains 20,000 hours of accented English speech data. It's collected from local English speakers in more than 20 countries, such as USA, China, UK, Germany, Japan, India, France, Spain, Russia, Latin America, covering a variety of pronunciation habits and characteristics, accent severity, and the distribution of speakers. The format is 16kHz, 16bit, uncompressed wav, mono channel. The sentence accuracy is over 95%.
For more details, please refer to the link: https://nexdata.ai/speechRecognition?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License
|
pccl-org/formal-logic-simple-order-new-objects-paired-taller-2000 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: paired_example
sequence:
sequence: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 506662724
num_examples: 1997003
download_size: 162099930
dataset_size: 506662724
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Leon-LLM/Leon-Chess-Dataset-71k-BOS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 38993636
num_examples: 71641
download_size: 19959801
dataset_size: 38993636
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Leon-Chess-Dataset-71k-BOS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/victorian_authorship | ---
language:
- en
tags:
- victorian
- text-classification
pretty_name: Victorian authorship
size_categories:
- 10K<n<100K
task_categories:
- text-classification
license: cc
---
# Victorian authorship
The [Victorian authorship dataset](https://scholarworks.iupui.edu/server/api/core/bitstreams/708a9870-915e-4d59-b54d-938af563c196/content).
Which Victorian author wrote the given text?
# Configurations and tasks
| **Configuration** | **Task** | Description |
|-------------------|---------------------------|---------------------------------------------------------------|
| authorship | Classification | Which Victorian author wrote the given text?|
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/victorian_authorship", "authorship")["train"]
```
# Features
|**Feature** |**Type** |
|-------------------|---------------|
| text | `[string]` |
# Citation
Cite this dataset as
```
@phdthesis{gungor2018benchmarking,
title={Benchmarking authorship attribution techniques using over a thousand books by fifty victorian era novelists},
author={Gungor, Abdulmecit},
year={2018},
school={Purdue University}
}
``` |
ZhongshengWang/Alpaca-pubmed-summarization | ---
license: openrail
language:
- en
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
task_categories:
- summarization
- text-generation
tags:
- conditional-text-generation
---
This dataset is the PubMed summarization data converted into the Stanford Alpaca instruction format, for lightweight fine-tuning of the Llama 2 large language model. You can click [here](https://www.runoob.com) to view.
Cite the original dataset as:
```
@inproceedings{cohan-etal-2018-discourse,
title = "A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents",
author = "Cohan, Arman and
Dernoncourt, Franck and
Kim, Doo Soon and
Bui, Trung and
Kim, Seokhwan and
Chang, Walter and
Goharian, Nazli",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2097",
doi = "10.18653/v1/N18-2097",
pages = "615--621",
abstract = "Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.",
}
``` |
totally-not-an-llm/melbourne-20 | ---
license: mit
---
|
Andre040423/vozluanpereira | ---
license: openrail
---
|
autoevaluate/autoeval-eval-jeffdshen__redefine_math2_8shot-jeffdshen__redefine_mat-af4c71-1853163413 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/redefine_math2_8shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-30b_eval
metrics: []
dataset_name: jeffdshen/redefine_math2_8shot
dataset_config: jeffdshen--redefine_math2_8shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-30b_eval
* Dataset: jeffdshen/redefine_math2_8shot
* Config: jeffdshen--redefine_math2_8shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
BENBENBENb/sythetic_casual_relation_medium_scale | ---
language:
- en
--- |
jrjyc1/demo | ---
license: openrail
task_categories:
- text-generation
- feature-extraction
language:
- ae
size_categories:
- 10M<n<100M
--- |
autoevaluate/autoeval-eval-autoevaluate__zero-shot-classification-sample-autoevalu-103f11-1986766201 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/zero-shot-classification-sample
eval_info:
task: text_zero_shot_classification
model: autoevaluate/zero-shot-classification
metrics: ['recall', 'precision']
dataset_name: autoevaluate/zero-shot-classification-sample
dataset_config: autoevaluate--zero-shot-classification-sample
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: autoevaluate/zero-shot-classification
* Dataset: autoevaluate/zero-shot-classification-sample
* Config: autoevaluate--zero-shot-classification-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MauritsG](https://huggingface.co/MauritsG) for evaluating this model. |
open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-13B | ---
pretty_name: Evaluation run of starmpcc/Asclepius-Llama2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [starmpcc/Asclepius-Llama2-13B](https://huggingface.co/starmpcc/Asclepius-Llama2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-13B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-19T12:18:05.781996](https://huggingface.co/datasets/open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-13B_public/blob/main/results_2023-11-19T12-18-05.781996.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5201519972088248,\n\
\ \"acc_stderr\": 0.034051581317112195,\n \"acc_norm\": 0.5290222877161421,\n\
\ \"acc_norm_stderr\": 0.03495991688232667,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.01576477083677731,\n \"mc2\": 0.4075956796231733,\n\
\ \"mc2_stderr\": 0.015612342660639225,\n \"em\": 0.022546140939597316,\n\
\ \"em_stderr\": 0.0015202810875087338,\n \"f1\": 0.12420616610738253,\n\
\ \"f1_stderr\": 0.002172993439883863\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.014580637569995421,\n\
\ \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6115315674168492,\n\
\ \"acc_stderr\": 0.004864058877626275,\n \"acc_norm\": 0.7965544712208723,\n\
\ \"acc_norm_stderr\": 0.004017383866405767\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993179,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993179\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6096774193548387,\n\
\ \"acc_stderr\": 0.027751256636969583,\n \"acc_norm\": 0.6096774193548387,\n\
\ \"acc_norm_stderr\": 0.027751256636969583\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.708256880733945,\n \"acc_stderr\": 0.019489300968876522,\n \"\
acc_norm\": 0.708256880733945,\n \"acc_norm_stderr\": 0.019489300968876522\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083292,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083292\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6835443037974683,\n \"acc_stderr\": 0.030274974880218977,\n \
\ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.030274974880218977\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n\
\ \"acc_stderr\": 0.028911208802749475,\n \"acc_norm\": 0.7350427350427351,\n\
\ \"acc_norm_stderr\": 0.028911208802749475\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7318007662835249,\n\
\ \"acc_stderr\": 0.01584243083526942,\n \"acc_norm\": 0.7318007662835249,\n\
\ \"acc_norm_stderr\": 0.01584243083526942\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5578034682080925,\n \"acc_stderr\": 0.026738603643807403,\n\
\ \"acc_norm\": 0.5578034682080925,\n \"acc_norm_stderr\": 0.026738603643807403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553974,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553974\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5326797385620915,\n \"acc_stderr\": 0.02856869975222587,\n\
\ \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.02856869975222587\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325967,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5864197530864198,\n \"acc_stderr\": 0.02740204204026996,\n\
\ \"acc_norm\": 0.5864197530864198,\n \"acc_norm_stderr\": 0.02740204204026996\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37614080834419816,\n\
\ \"acc_stderr\": 0.012372214430599816,\n \"acc_norm\": 0.37614080834419816,\n\
\ \"acc_norm_stderr\": 0.012372214430599816\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4934640522875817,\n \"acc_stderr\": 0.020226106567657803,\n \
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.020226106567657803\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.01576477083677731,\n \"mc2\": 0.4075956796231733,\n\
\ \"mc2_stderr\": 0.015612342660639225\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869456\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.022546140939597316,\n \
\ \"em_stderr\": 0.0015202810875087338,\n \"f1\": 0.12420616610738253,\n\
\ \"f1_stderr\": 0.002172993439883863\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492627\n\
\ }\n}\n```"
repo_url: https://huggingface.co/starmpcc/Asclepius-Llama2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|arc:challenge|25_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|drop|3_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|gsm8k|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hellaswag|10_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T12-18-05.781996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T12-18-05.781996.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- '**/details_harness|winogrande|5_2023-11-19T12-18-05.781996.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-19T12-18-05.781996.parquet'
- config_name: results
data_files:
- split: 2023_11_19T12_18_05.781996
path:
- results_2023-11-19T12-18-05.781996.parquet
- split: latest
path:
- results_2023-11-19T12-18-05.781996.parquet
---
# Dataset Card for Evaluation run of starmpcc/Asclepius-Llama2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/starmpcc/Asclepius-Llama2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
This dataset was automatically created during the evaluation run of model [starmpcc/Asclepius-Llama2-13B](https://huggingface.co/starmpcc/Asclepius-Llama2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# Load the per-example details for one task configuration of this run.
data = load_dataset(
    "open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-13B_public",
    "harness_winogrande_5",
    split="latest",  # or a timestamped split such as "2023_11_19T12_18_05.781996"
)
```
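The aggregated metrics live in the `"results"` configuration mentioned above; a minimal sketch for loading them (assuming the same repository and the `latest` split listed in the configs) could look like this:
```python
from datasets import load_dataset

# Load the aggregated results of the run; "latest" points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-13B_public",
    "results",
    split="latest",
)
# The exact column layout is an assumption here; inspect a row to see the available fields.
print(results[0])
```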
## Latest results
These are the [latest results from run 2023-11-19T12:18:05.781996](https://huggingface.co/datasets/open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-13B_public/blob/main/results_2023-11-19T12-18-05.781996.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5201519972088248,
"acc_stderr": 0.034051581317112195,
"acc_norm": 0.5290222877161421,
"acc_norm_stderr": 0.03495991688232667,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.01576477083677731,
"mc2": 0.4075956796231733,
"mc2_stderr": 0.015612342660639225,
"em": 0.022546140939597316,
"em_stderr": 0.0015202810875087338,
"f1": 0.12420616610738253,
"f1_stderr": 0.002172993439883863
},
"harness|arc:challenge|25": {
"acc": 0.5324232081911263,
"acc_stderr": 0.014580637569995421,
"acc_norm": 0.5588737201365188,
"acc_norm_stderr": 0.014509747749064664
},
"harness|hellaswag|10": {
"acc": 0.6115315674168492,
"acc_stderr": 0.004864058877626275,
"acc_norm": 0.7965544712208723,
"acc_norm_stderr": 0.004017383866405767
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993179,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993179
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6096774193548387,
"acc_stderr": 0.027751256636969583,
"acc_norm": 0.6096774193548387,
"acc_norm_stderr": 0.027751256636969583
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.708256880733945,
"acc_stderr": 0.019489300968876522,
"acc_norm": 0.708256880733945,
"acc_norm_stderr": 0.019489300968876522
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083292,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083292
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.030274974880218977,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.030274974880218977
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.028911208802749475,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.028911208802749475
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7318007662835249,
"acc_stderr": 0.01584243083526942,
"acc_norm": 0.7318007662835249,
"acc_norm_stderr": 0.01584243083526942
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5578034682080925,
"acc_stderr": 0.026738603643807403,
"acc_norm": 0.5578034682080925,
"acc_norm_stderr": 0.026738603643807403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553974,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553974
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.02856869975222587,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.02856869975222587
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325967,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5864197530864198,
"acc_stderr": 0.02740204204026996,
"acc_norm": 0.5864197530864198,
"acc_norm_stderr": 0.02740204204026996
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37614080834419816,
"acc_stderr": 0.012372214430599816,
"acc_norm": 0.37614080834419816,
"acc_norm_stderr": 0.012372214430599816
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.020226106567657803,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.020226106567657803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.01576477083677731,
"mc2": 0.4075956796231733,
"mc2_stderr": 0.015612342660639225
},
"harness|winogrande|5": {
"acc": 0.7269139700078927,
"acc_stderr": 0.012522020105869456
},
"harness|drop|3": {
"em": 0.022546140939597316,
"em_stderr": 0.0015202810875087338,
"f1": 0.12420616610738253,
"f1_stderr": 0.002172993439883863
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492627
}
}
```
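As a small illustration (not part of the generated card), the sketch below averages the per-task MMLU (`hendrycksTest-*`) accuracies from a dictionary shaped like the JSON above; the leaderboard computes its own aggregates, so this is only for local inspection.
```python
# Minimal example using a few of the entries shown above; in practice the full
# dictionary would be parsed from the results JSON linked in this section.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4444444444444444},
    "harness|winogrande|5": {"acc": 0.7269139700078927},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_accs = [v["acc"] for key, v in results.items() if key.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```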
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SSUS/es | ---
license: openrail
---
|