Columns: datasetId (string, lengths 2 to 117) and card (string, lengths 19 to 1.01M).
Willinton/Code_Llms_HiddenStates
---
license: mit
---
AchrefLearning/big_five_classification
---
license: mit
---
Sakshamrzt/IndicNLP-Punjabi
---
dataset_info:
- config_name: default
  features:
  - name: headline
    dtype: string
  - name: news
    dtype: string
  - name: class
    dtype: float64
  splits:
  - name: train
configs:
- config_name: default
  data_files:
  - split: train
    path: train.jsonl
- config_name: test
  data_files:
  - split: test
    path: test.jsonl
---
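Since the splits ship as plain JSON Lines files (`train.jsonl`, `test.jsonl`), they can be read without any special tooling. A minimal stdlib sketch; the sample record is hypothetical but follows the declared schema (`headline`: string, `news`: string, `class`: float64):

```python
import json

# Hypothetical record following the card's schema:
# headline (string), news (string), class (float64).
sample = {"headline": "ਸਿਰਲੇਖ", "news": "ਖ਼ਬਰ ਦਾ ਪੂਰਾ ਪਾਠ", "class": 0.0}
line = json.dumps(sample, ensure_ascii=False)

def read_jsonl(lines):
    """Parse an iterable of JSON Lines records into dicts."""
    return [json.loads(ln) for ln in lines if ln.strip()]

records = read_jsonl([line])
```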
CyberHarem/kurosaki_chitose_idolmastercinderellagirls
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kurosaki_chitose/黒埼ちとせ (THE iDOLM@STER: Cinderella Girls)

This is the dataset of kurosaki_chitose/黒埼ちとせ (THE iDOLM@STER: Cinderella Girls), containing 328 images and their tags.

The core tags of this character are `blonde_hair, long_hair, bangs, red_eyes, breasts, hair_between_eyes, hairband, very_long_hair, large_breasts`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download | Type       | Description |
|:-----------------|-------:|:-----------|:---------|:-----------|:------------|
| raw              | 328    | 532.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosaki_chitose_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 328    | 293.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosaki_chitose_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800  | 830    | 626.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosaki_chitose_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 328    | 465.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosaki_chitose_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 830    | 913.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosaki_chitose_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kurosaki_chitose_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, long_sleeves, looking_at_viewer, solo, open_mouth, white_shirt, black_hairband, :d, white_background, brooch, frills, simple_background, skirt, upper_body, blush, sleeves_past_wrists, thighhighs | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | serafuku, 1girl, long_sleeves, red_neckerchief, shirt, looking_at_viewer, solo, white_sailor_collar, blush, pleated_skirt, black_skirt, black_hairband, open_cardigan, open_mouth, white_background, :d, simple_background | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, looking_at_viewer, white_dress, smile, solo, strapless_dress, wedding_dress, blush, bridal_veil, bride, cleavage, upper_body, collarbone, closed_mouth, holding_bouquet, necklace, red_rose, white_background | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, 
bare_shoulders, hair_flower, looking_at_viewer, medium_breasts, red_rose, solo, red_dress, smile, blush, cleavage, hair_intakes, wrist_cuffs, nail_polish, petals, red_nails, simple_background, strapless, upper_body, white_background | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, looking_at_viewer, navel, solo, cleavage, collarbone, midriff, pink_shirt, groin, long_sleeves, open_mouth, pink_shorts, simple_background, sweat, white_background, :d, crop_top, heart, off_shoulder, pink_eyes, sleeves_past_wrists, stomach | | 5 | 17 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | looking_at_viewer, smile, 1girl, blush, collarbone, navel, solo, cleavage, bikini, bare_shoulders, sitting, black_hairband, closed_mouth, thighs | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, blush, hetero, solo_focus, sweat, looking_at_viewer, nipples, penis, pov, smile, breast_grab, grabbing, mosaic_censoring, nude, open_mouth, paizuri, black_hairband, collarbone, male_pubic_hair, navel | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, fake_animal_ears, playboy_bunny, rabbit_ears, detached_collar, looking_at_viewer, solo, strapless_leotard, black_leotard, cleavage, smile, wrist_cuffs, bare_shoulders, blush, simple_background, thighhighs, white_background, black_hairband, bowtie, covered_navel, medium_breasts, nail_polish, pantyhose, rabbit_tail, red_nails | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | 
long_sleeves | looking_at_viewer | solo | open_mouth | white_shirt | black_hairband | :d | white_background | brooch | frills | simple_background | skirt | upper_body | blush | sleeves_past_wrists | thighhighs | serafuku | red_neckerchief | shirt | white_sailor_collar | pleated_skirt | black_skirt | open_cardigan | bare_shoulders | white_dress | smile | strapless_dress | wedding_dress | bridal_veil | bride | cleavage | collarbone | closed_mouth | holding_bouquet | necklace | red_rose | hair_flower | medium_breasts | red_dress | hair_intakes | wrist_cuffs | nail_polish | petals | red_nails | strapless | navel | midriff | pink_shirt | groin | pink_shorts | sweat | crop_top | heart | off_shoulder | pink_eyes | stomach | bikini | sitting | thighs | 1boy | hetero | solo_focus | nipples | penis | pov | breast_grab | grabbing | mosaic_censoring | nude | paizuri | male_pubic_hair | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | strapless_leotard | black_leotard | bowtie | covered_navel | pantyhose | rabbit_tail | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------|:-------------|:--------------|:-----------------|:-----|:-------------------|:---------|:---------|:--------------------|:--------|:-------------|:--------|:----------------------|:-------------|:-----------|:------------------|:--------|:----------------------|:----------------|:--------------|:----------------|:-----------------|:--------------|:--------|:------------------|:----------------|:--------------|:--------|:-----------|:-------------|:---------------|:------------------|:-----------|:-----------|:--------------|:-----------------|:------------|:---------------|:--------------|:--------------|:---------|:------------|:------------|:--------|:----------|:-------------|:--------|:--------------|:--------|:-----------|:--------|:---------------|:------------|:----------|:---------|:----------|:---------|:-------|:---------|:-------------|:----------|:--------|:------|:--------------|:-----------|:-------------------|:-------|:----------|:------------------|:-------------------|:----------------|:--------------|:------------------|:--------------------|:----------------|:---------|:----------------|:------------|:--------------| | 0 | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | | X | X | X | | 
| X | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | | | | X | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | | | | X | | | X | | X | X | | | | | | | | | | X | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | X | | | X | X | | | X | | | X | X | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 17 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | | | X | | | | | | | | X | | | | | | | | | | X | | X | | | | | X | X | X | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | X | | X | | | | | | | | X | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | X | | | | | X | | | | 
| | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | X | | | X | | X | | | X | | | X | | X | | | | | | | | X | | X | | | | | X | | | | | | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
katxtong/tokenized_coqa_size356
---
dataset_info:
  features:
  - name: input_ids
    sequence: int32
  - name: attention_mask
    sequence: int8
  - name: start_positions
    dtype: int64
  - name: end_positions
    dtype: int64
  splits:
  - name: train
    num_bytes: 195999188
    num_examples: 108647
  - name: validation
    num_bytes: 14401332
    num_examples: 7983
  download_size: 51708569
  dataset_size: 210400520
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
---
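The split metadata above is internally consistent; a quick arithmetic check, with the numbers copied from the card:

```python
# (num_bytes, num_examples) per split, copied from the card's metadata.
splits = {
    "train": (195_999_188, 108_647),
    "validation": (14_401_332, 7_983),
}

dataset_size = sum(nbytes for nbytes, _ in splits.values())
total_examples = sum(n for _, n in splits.values())
avg_bytes_per_example = dataset_size // total_examples  # rough per-example footprint
```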
Tsinggu/PolyU-COMP-Information
---
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
---
PolyU-COMP-Information is a question-answering dataset about the Department of Computing at PolyU, containing 370 rows of question-and-answer data.
TheGreatP/minhavozcerto
---
license: openrail
---
fazni/roles-based-on-skills
---
license: mit
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: Role
    dtype: string
  - name: text
    dtype: string
  - name: label
    dtype: int64
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 2272289
    num_examples: 3660
  - name: test
    num_bytes: 577048
    num_examples: 916
  download_size: 1174905
  dataset_size: 2849337
---
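From the split sizes above, the test split is roughly a 20% holdout; a quick check, with the counts copied from the card:

```python
# Split sizes copied from the card's metadata.
train_examples, test_examples = 3_660, 916
train_bytes, test_bytes = 2_272_289, 577_048

total_examples = train_examples + test_examples
test_fraction = test_examples / total_examples  # roughly a 20% holdout
dataset_size = train_bytes + test_bytes         # matches the card's dataset_size
```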
japanese-asr/whisper_transcriptions.reazonspeech.all_59
---
dataset_info:
  config_name: all
  features:
  - name: name
    dtype: string
  - name: audio
    dtype:
      audio:
        sampling_rate: 16000
  - name: transcription
    dtype: string
  - name: whisper_transcript
    sequence: int64
  splits:
  - name: train
    num_bytes: 30349117557.0
    num_examples: 266948
  download_size: 30113495680
  dataset_size: 30349117557.0
configs:
- config_name: all
  data_files:
  - split: train
    path: all/train-*
---
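A back-of-envelope estimate from the metadata above: dividing the train split's byte count by its example count gives the average storage per example and, under the simplifying assumption of raw 16-bit mono PCM at the declared 16 kHz sampling rate, a rough sense of average clip length (the transcript fields and the actual audio encoding also contribute to the byte count):

```python
# Train split statistics copied from the card's metadata.
num_bytes, num_examples = 30_349_117_557, 266_948

avg_bytes = num_bytes // num_examples

# Back-of-envelope only: assume raw 16-bit mono PCM at 16 kHz,
# i.e. 32,000 bytes per second of audio.
approx_seconds = avg_bytes / 32_000
```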
KolaGang/process_instruct
---
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: input
    dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 2020141143
    num_examples: 274459
  download_size: 626897321
  dataset_size: 2020141143
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
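Records follow the familiar instruction/input/output layout; a sketch of rendering one record into a single training prompt using the common Alpaca-style template (an assumption: the card does not say how the fields are meant to be combined):

```python
def format_example(rec: dict) -> str:
    """Render an instruction/input/output record as one prompt string,
    using the common Alpaca-style template (an assumption; the card
    does not specify a template)."""
    if rec.get("input"):
        return (
            f"### Instruction:\n{rec['instruction']}\n\n"
            f"### Input:\n{rec['input']}\n\n"
            f"### Response:\n{rec['output']}"
        )
    # records with an empty input field omit the Input section
    return (
        f"### Instruction:\n{rec['instruction']}\n\n"
        f"### Response:\n{rec['output']}"
    )
```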
jxie/dtd
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: label
    dtype:
      class_label:
        names:
          '0': banded
          '1': blotchy
          '2': braided
          '3': bubbly
          '4': bumpy
          '5': chequered
          '6': cobwebbed
          '7': cracked
          '8': crosshatched
          '9': crystalline
          '10': dotted
          '11': fibrous
          '12': flecked
          '13': freckled
          '14': frilly
          '15': gauzy
          '16': grid
          '17': grooved
          '18': honeycombed
          '19': interlaced
          '20': knitted
          '21': lacelike
          '22': lined
          '23': marbled
          '24': matted
          '25': meshed
          '26': paisley
          '27': perforated
          '28': pitted
          '29': pleated
          '30': polka-dotted
          '31': porous
          '32': potholed
          '33': scaly
          '34': smeared
          '35': spiralled
          '36': sprinkled
          '37': stained
          '38': stratified
          '39': striped
          '40': studded
          '41': swirly
          '42': veined
          '43': waffled
          '44': woven
          '45': wrinkled
          '46': zigzagged
  splits:
  - name: train
    num_bytes: 226313270.04
    num_examples: 1880
  - name: test
    num_bytes: 172035822.0
    num_examples: 1880
  - name: validation
    num_bytes: 222278767.48
    num_examples: 1880
  download_size: 629310459
  dataset_size: 620627859.52
---
# Dataset Card for "dtd"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
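For convenience, the `class_label` names above can be turned into an index-to-name mapping, e.g. when decoding model predictions back into texture names; a small sketch with the list copied from the card:

```python
# The 47 DTD texture classes, in the index order given by the card.
DTD_CLASSES = [
    "banded", "blotchy", "braided", "bubbly", "bumpy", "chequered",
    "cobwebbed", "cracked", "crosshatched", "crystalline", "dotted",
    "fibrous", "flecked", "freckled", "frilly", "gauzy", "grid",
    "grooved", "honeycombed", "interlaced", "knitted", "lacelike",
    "lined", "marbled", "matted", "meshed", "paisley", "perforated",
    "pitted", "pleated", "polka-dotted", "porous", "potholed", "scaly",
    "smeared", "spiralled", "sprinkled", "stained", "stratified",
    "striped", "studded", "swirly", "veined", "waffled", "woven",
    "wrinkled", "zigzagged",
]

id2label = dict(enumerate(DTD_CLASSES))
label2id = {name: i for i, name in id2label.items()}
```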
open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep
--- pretty_name: Evaluation run of BFauber/opt125m_10e5_20ep dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [BFauber/opt125m_10e5_20ep](https://huggingface.co/BFauber/opt125m_10e5_20ep)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-02T19:31:32.659309](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep/blob/main/results_2024-02-02T19-31-32.659309.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23530235034095837,\n\ \ \"acc_stderr\": 0.029883235955403636,\n \"acc_norm\": 0.23550247007959293,\n\ \ \"acc_norm_stderr\": 0.030668377840836165,\n \"mc1\": 0.24112607099143207,\n\ \ \"mc1_stderr\": 0.014974827279752332,\n \"mc2\": 0.4648926603532375,\n\ \ \"mc2_stderr\": 0.01559555646787533\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063269,\n\ \ \"acc_norm\": 0.25426621160409557,\n \"acc_norm_stderr\": 0.012724999945157734\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2847042421828321,\n\ \ \"acc_stderr\": 0.0045035118550500325,\n \"acc_norm\": 0.3084047002589126,\n\ \ \"acc_norm_stderr\": 0.004608907872957696\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\ \ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\ \ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\ \ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\ \ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\ \ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\ \ \"acc_norm_stderr\": 
0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.16,\n\ \ \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n \ \ \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\ \ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\ \ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\ \ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\ \ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\ \ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\ acc_norm\": 
0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\ \ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\ \ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.27419354838709675,\n \"acc_stderr\": 0.025378139970885193,\n \"\ acc_norm\": 0.27419354838709675,\n \"acc_norm_stderr\": 0.025378139970885193\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\ acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\"\ : 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\ acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860657,\n\ \ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860657\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n\ \ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.21481481481481482,\n \"acc_stderr\": 0.02504044387700068,\n \ \ \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.02504044387700068\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.18907563025210083,\n \"acc_stderr\": 0.02543511943810535,\n\ \ \"acc_norm\": 0.18907563025210083,\n \"acc_norm_stderr\": 0.02543511943810535\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294285,\n \"\ acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294285\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.2036697247706422,\n \"acc_stderr\": 0.017266742087630793,\n \"\ acc_norm\": 0.2036697247706422,\n \"acc_norm_stderr\": 0.017266742087630793\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057993,\n \"\ acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057993\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"\ acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460305,\n \ \ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460305\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\ \ \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.3094170403587444,\n\ \ \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n\ \ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\ \ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\ \ \"acc_stderr\": 0.027421007295392926,\n \"acc_norm\": 0.2264957264957265,\n\ \ \"acc_norm_stderr\": 0.027421007295392926\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n\ \ \"acc_stderr\": 0.015302380123542094,\n \"acc_norm\": 0.2413793103448276,\n\ \ \"acc_norm_stderr\": 0.015302380123542094\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\ \ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\ \ \"acc_stderr\": 0.014242630070574915,\n 
\"acc_norm\": 0.23798882681564246,\n\ \ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02380518652488815,\n\ \ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02380518652488815\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\ \ \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.1864951768488746,\n\ \ \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\ \ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \ \ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\ \ \"acc_stderr\": 0.011064151027165434,\n \"acc_norm\": 0.2503259452411995,\n\ \ \"acc_norm_stderr\": 0.011064151027165434\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.40441176470588236,\n \"acc_stderr\": 0.029812630701569743,\n\ \ \"acc_norm\": 0.40441176470588236,\n \"acc_norm_stderr\": 0.029812630701569743\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n\ \ \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n\ \ \"acc_norm_stderr\": 0.027372942201788163\n },\n 
\"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.03152439186555404,\n\ \ \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.03152439186555404\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\ \ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\ \ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n\ \ \"acc_stderr\": 0.03488647713457921,\n \"acc_norm\": 0.29239766081871343,\n\ \ \"acc_norm_stderr\": 0.03488647713457921\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752332,\n\ \ \"mc2\": 0.4648926603532375,\n \"mc2_stderr\": 0.01559555646787533\n\ \ },\n \"harness|winogrande|5\": {\n \"acc\": 0.510655090765588,\n\ \ \"acc_stderr\": 0.014049294536290396\n },\n \"harness|gsm8k|5\":\ \ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```" repo_url: https://huggingface.co/BFauber/opt125m_10e5_20ep leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|arc:challenge|25_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-02T19-31-32.659309.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|gsm8k|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_02T19_31_32.659309 path: - 
'**/details_harness|hellaswag|10_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T19-31-32.659309.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_02T19_31_32.659309 path: - '**/details_harness|winogrande|5_2024-02-02T19-31-32.659309.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-02T19-31-32.659309.parquet' - config_name: results data_files: - split: 
2024_02_02T19_31_32.659309 path: - results_2024-02-02T19-31-32.659309.parquet - split: latest path: - results_2024-02-02T19-31-32.659309.parquet --- # Dataset Card for Evaluation run of BFauber/opt125m_10e5_20ep <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_20ep](https://huggingface.co/BFauber/opt125m_10e5_20ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T19:31:32.659309](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep/blob/main/results_2024-02-02T19-31-32.659309.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23530235034095837, "acc_stderr": 0.029883235955403636, "acc_norm": 0.23550247007959293, "acc_norm_stderr": 0.030668377840836165, "mc1": 0.24112607099143207, "mc1_stderr": 0.014974827279752332, "mc2": 0.4648926603532375, "mc2_stderr": 0.01559555646787533 }, "harness|arc:challenge|25": { "acc": 0.22610921501706485, "acc_stderr": 0.012224202097063269, "acc_norm": 0.25426621160409557, "acc_norm_stderr": 0.012724999945157734 }, "harness|hellaswag|10": { "acc": 0.2847042421828321, "acc_stderr": 0.0045035118550500325, "acc_norm": 0.3084047002589126, "acc_norm_stderr": 0.004608907872957696 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.23703703703703705, "acc_stderr": 0.03673731683969506, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.03673731683969506 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21132075471698114, "acc_stderr": 0.025125766484827845, "acc_norm": 0.21132075471698114, "acc_norm_stderr": 0.025125766484827845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2361111111111111, "acc_stderr": 0.03551446610810826, "acc_norm": 0.2361111111111111, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.16, "acc_stderr": 0.0368452949177471, "acc_norm": 0.16, 
"acc_norm_stderr": 0.0368452949177471 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.25957446808510637, "acc_stderr": 0.02865917937429232, "acc_norm": 0.25957446808510637, "acc_norm_stderr": 0.02865917937429232 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.0404933929774814, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.0404933929774814 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2620689655172414, "acc_stderr": 0.036646663372252565, "acc_norm": 0.2620689655172414, "acc_norm_stderr": 0.036646663372252565 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25132275132275134, "acc_stderr": 0.022340482339643895, "acc_norm": 0.25132275132275134, "acc_norm_stderr": 0.022340482339643895 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.15079365079365079, "acc_stderr": 0.03200686497287392, "acc_norm": 0.15079365079365079, "acc_norm_stderr": 0.03200686497287392 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.27419354838709675, "acc_stderr": 0.025378139970885193, "acc_norm": 0.27419354838709675, "acc_norm_stderr": 0.025378139970885193 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.16, "acc_stderr": 0.0368452949177471, "acc_norm": 0.16, "acc_norm_stderr": 0.0368452949177471 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860657, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860657 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2205128205128205, "acc_stderr": 0.021020672680827912, "acc_norm": 0.2205128205128205, "acc_norm_stderr": 0.021020672680827912 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.21481481481481482, "acc_stderr": 0.02504044387700068, "acc_norm": 0.21481481481481482, "acc_norm_stderr": 0.02504044387700068 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.18907563025210083, "acc_stderr": 0.02543511943810535, "acc_norm": 0.18907563025210083, "acc_norm_stderr": 0.02543511943810535 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.18543046357615894, "acc_stderr": 0.03173284384294285, "acc_norm": 0.18543046357615894, "acc_norm_stderr": 0.03173284384294285 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.2036697247706422, "acc_stderr": 0.017266742087630793, "acc_norm": 0.2036697247706422, "acc_norm_stderr": 0.017266742087630793 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2037037037037037, "acc_stderr": 
0.027467401804057993, "acc_norm": 0.2037037037037037, "acc_norm_stderr": 0.027467401804057993 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.030190282453501947, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.030190282453501947 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25738396624472576, "acc_stderr": 0.028458820991460305, "acc_norm": 0.25738396624472576, "acc_norm_stderr": 0.028458820991460305 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3094170403587444, "acc_stderr": 0.031024411740572206, "acc_norm": 0.3094170403587444, "acc_norm_stderr": 0.031024411740572206 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26380368098159507, "acc_stderr": 0.03462419931615623, "acc_norm": 0.26380368098159507, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2264957264957265, "acc_stderr": 0.027421007295392926, "acc_norm": 0.2264957264957265, "acc_norm_stderr": 0.027421007295392926 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, 
"acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2413793103448276, "acc_stderr": 0.015302380123542094, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.015302380123542094 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2222222222222222, "acc_stderr": 0.02380518652488815, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.02380518652488815 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.022122439772480768, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.022122439772480768 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432414, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432414 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2503259452411995, "acc_stderr": 0.011064151027165434, "acc_norm": 0.2503259452411995, "acc_norm_stderr": 0.011064151027165434 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.40441176470588236, "acc_stderr": 0.029812630701569743, "acc_norm": 0.40441176470588236, "acc_norm_stderr": 0.029812630701569743 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 
0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.24081632653061225, "acc_stderr": 0.027372942201788163, "acc_norm": 0.24081632653061225, "acc_norm_stderr": 0.027372942201788163 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2736318407960199, "acc_stderr": 0.03152439186555404, "acc_norm": 0.2736318407960199, "acc_norm_stderr": 0.03152439186555404 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.29239766081871343, "acc_stderr": 0.03488647713457921, "acc_norm": 0.29239766081871343, "acc_norm_stderr": 0.03488647713457921 }, "harness|truthfulqa:mc|0": { "mc1": 0.24112607099143207, "mc1_stderr": 0.014974827279752332, "mc2": 0.4648926603532375, "mc2_stderr": 0.01559555646787533 }, "harness|winogrande|5": { "acc": 0.510655090765588, "acc_stderr": 0.014049294536290396 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
-->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations.
-->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
dmrau/trec_dl20-qrels
---
dataset_info:
  features:
  - name: query-id
    dtype: string
  - name: corpus-id
    dtype: string
  - name: score
    dtype: string
  splits:
  - name: test
    num_bytes: 298319
    num_examples: 11386
  download_size: 0
  dataset_size: 298319
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
---
# Dataset Card for "trec_dl20-qrels"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
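The three string columns follow the usual TREC qrels layout: a query id, a document (corpus) id, and a graded relevance score. As a small usage sketch — the `group_qrels` helper name is mine, and it assumes rows shaped like this dataset's `test` split — judgments can be grouped per query like so:

```python
from collections import defaultdict

def group_qrels(rows):
    """Group TREC-style qrels rows into {query-id: {corpus-id: score}}.

    `rows` is any iterable of dicts with this dataset's string fields:
    "query-id", "corpus-id", and "score" (cast to int here).
    """
    qrels = defaultdict(dict)
    for row in rows:
        qrels[row["query-id"]][row["corpus-id"]] = int(row["score"])
    return dict(qrels)

# With the real data, `rows` would come from e.g.:
#   rows = load_dataset("dmrau/trec_dl20-qrels", split="test")
```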
maxolotl/must-c-en-fr-wait07_22.23
---
dataset_info:
  features:
  - name: current_source
    dtype: string
  - name: current_target
    dtype: string
  - name: target_token
    dtype: string
  splits:
  - name: train
    num_bytes: 1157797754
    num_examples: 5530635
  - name: test
    num_bytes: 12864307
    num_examples: 64317
  - name: validation
    num_bytes: 6034981
    num_examples: 29172
  download_size: 182901094
  dataset_size: 1176697042
---
# Dataset Card for "must-c-en-fr-wait07_22.23"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Bruna1221/RVC_Models_by_Bruna1221
---
license: cc-by-2.0
pretty_name: RVC_Models
size_categories:
- n<1K
---
polinaeterna/test-windows
---
builder_config:
  data_files:
  - split: train
    pattern: data/train-*
  - split: random
    pattern: data/random-*
dataset_info:
  features:
  - name: x
    dtype: int64
  - name: y
    dtype: int64
  splits:
  - name: train
    num_bytes: 16000
    num_examples: 1000
  - name: random
    num_bytes: 1600
    num_examples: 100
  download_size: 0
  dataset_size: 17600
---
# Dataset Card for "test-windows"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-eval-squad_v2-squad_v2-8571ec-1652758614
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
  task: extractive_question_answering
  model: Palak/microsoft_deberta-base_squad
  metrics: []
  dataset_name: squad_v2
  dataset_config: squad_v2
  dataset_split: validation
  col_mapping:
    context: context
    question: question
    answers-text: answers.text
    answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Question Answering
* Model: Palak/microsoft_deberta-base_squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model.
DavidVivancos/MindBigData2022_Imagenet_IN_Spct
---
license: odbl
---
Sagar0934/guanaco-llama2-1k
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 1654448
    num_examples: 1000
  download_size: 0
  dataset_size: 1654448
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
DeliberatorArchiver/discography_v2_cdn
---
license: cc-by-nc-nd-4.0
viewer: false
---
# discography_v2_cdn

An archive of a rebuilt music database, using UUID version 7 ([uuidv7](https://github.com/LiosK/uuidv7)) identifiers.
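For readers unfamiliar with the identifier scheme: UUIDv7 prefixes 74 random bits with a 48-bit Unix-millisecond timestamp, so ids generated at different times sort roughly chronologically. A minimal sketch of the field layout (RFC 9562), not the linked `uuidv7` library itself — in particular it omits that library's intra-millisecond monotonicity counter:

```python
import os
import time
import uuid

def uuidv7() -> uuid.UUID:
    """Minimal UUIDv7 per RFC 9562 (no monotonicity counter)."""
    ts_ms = time.time_ns() // 1_000_000
    rand = int.from_bytes(os.urandom(10), "big")    # 80 random bits, 74 used
    value = (ts_ms & 0xFFFFFFFFFFFF) << 80          # bits 127..80: Unix-ms timestamp
    value |= 0x7 << 76                              # bits  79..76: version 7
    value |= ((rand >> 62) & 0xFFF) << 64           # bits  75..64: rand_a
    value |= 0x2 << 62                              # bits  63..62: variant 0b10
    value |= rand & 0x3FFFFFFFFFFFFFFF              # bits  61..0 : rand_b
    return uuid.UUID(int=value)
```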
amanteur/CHAD_hummings
---
license: cc-by-nc-4.0
task_categories:
- feature-extraction
tags:
- music
size_categories:
- 1K<n<10K
viewer: false
---
# CHAD-Hummings Subset

This repository contains the hummings subset of the dataset from ["A Semi-Supervised Deep Learning Approach to Dataset Collection for Query-by-Humming Task"]() (ISMIR 2023).

For the complete dataset and further details, please visit the main [GitHub repository](https://github.com/amanteur/CHAD#hummings).

---

# Overview

The `chad_hummings_subset.tar.gz` archive provided in this repository contains a collection of 5,314 humming audio files. These audio files are sorted into groups of 693 distinct humming fragments originating from 311 unique songs (groups). Audio format: `.wav`.

---

# Dataset Structure

Upon extracting the dataset from `chad_hummings_subset.tar.gz`, you will find the following structured hierarchy:

```
├── {GROUP_ID}
│   ├── {FRAGMENT_ID}
│       ├── {ID}.wav
│       └── ...
│   └── ...
└── ...
```

where

- `GROUP_ID` - the unique identifier for each song,
- `FRAGMENT_ID` - the identifier for individual fragments within a song,
- `ID` - the version identifier for a specific fragment of the song.

This structured hierarchy organizes the audio files and fragments, making it easier to navigate and work with the dataset.

---

# Citation

Please cite the following paper if you use the code or dataset provided in this repository.

```bibtex
@inproceedings{Amatov2023,
  title={A Semi-Supervised Deep Learning Approach to Dataset Collection for Query-by-Humming Task},
  author={Amatov, Amantur and Lamanov, Dmitry and Titov, Maksim and Vovk, Ivan and Makarov, Ilya and Kudinov, Mikhail},
  year={2023},
}
```
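As a small navigation sketch for the group/fragment/version hierarchy described above — assuming the archive has been extracted to a local directory; the `index_hummings` helper name is mine:

```python
from pathlib import Path

def index_hummings(root: str) -> dict:
    """Map (GROUP_ID, FRAGMENT_ID) -> list of .wav paths, assuming the
    extracted chad_hummings_subset layout: {root}/{GROUP_ID}/{FRAGMENT_ID}/{ID}.wav
    """
    index: dict = {}
    for wav in Path(root).glob("*/*/*.wav"):
        group_id, fragment_id = wav.parts[-3], wav.parts[-2]
        index.setdefault((group_id, fragment_id), []).append(wav)
    return index
```

With this index, all humming versions of one fragment are retrieved by its `(GROUP_ID, FRAGMENT_ID)` key.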
mahdibaghbanzadeh/GUE_splice_reconstructed
---
dataset_info:
  features:
  - name: sequence
    dtype: string
  - name: labels
    dtype:
      class_label:
        names:
          '0': '0'
          '1': '1'
          '2': '2'
  splits:
  - name: train
    num_bytes: 15036352
    num_examples: 36496
  - name: val
    num_bytes: 1879544
    num_examples: 4562
  - name: test
    num_bytes: 1879544
    num_examples: 4562
  download_size: 8806003
  dataset_size: 18795440
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: val
    path: data/val-*
  - split: test
    path: data/test-*
---
Atipico1/mrqa_squad-tqa-sqa
---
dataset_info:
  features:
  - name: subset
    dtype: string
  - name: context
    dtype: string
  - name: qid
    dtype: string
  - name: question
    dtype: string
  - name: detected_answers
    struct:
    - name: char_spans
      list:
      - name: end
        sequence: int64
      - name: start
        sequence: int64
      - name: text
        sequence: string
    - name: token_spans
      list:
      - name: end
        sequence: int64
      - name: start
        sequence: int64
  - name: answers
    sequence: string
  splits:
  - name: train
    num_bytes: 873312545
    num_examples: 265660
  download_size: 470656859
  dataset_size: 873312545
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
srinathmkce/CarAssistant
---
license: apache-2.0
---
jouyang/clevr_1000
---
license: mit
---
Atul790/dress-lora3
---
dataset_info:
  features:
  - name: image
    dtype: image
  splits:
  - name: train
    num_bytes: 5007966.0
    num_examples: 19
  download_size: 5009725
  dataset_size: 5007966.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
bigheiniuJ/EvalMetaICL
---
dataset_info:
  features:
  - name: task
    dtype: string
  - name: input
    dtype: string
  - name: output
    dtype: string
  - name: options
    sequence: string
  - name: seed
    dtype: string
  - name: split
    dtype: string
  splits:
  - name: meta_eval
    num_bytes: 568291690
    num_examples: 984390
  - name: meta_eval_100shot
    num_bytes: 587900520
    num_examples: 1035630
  - name: meta_train
    num_bytes: 162025836
    num_examples: 384022
  download_size: 253960910
  dataset_size: 1318218046
---
# Dataset Card for "EvalMetaICL"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pharaouk/dharma_test2
---
configs:
- config_name: default
  data_files:
  - split: 'dharma_test2_shuffled'
    path: final/dharma_test2_eval_shuffled*
  - split: 'dharma_test2_unshuffled'
    path: final/dharma_test2_eval_unshuffled*
---
# "dharma_test2 Dataset"

A dharma evaluation dataset with the following configuration:

||| Subject: MMLU, Size: 12 |||
||| Subject: ARC-Challenge, Size: 12 |||
||| Subject: ARC-Easy, Size: 12 |||
||| Subject: BoolQ, Size: 12 |||
||| Subject: winogrande, Size: 12 |||
||| Subject: openbookqa, Size: 12 |||
||| Subject: truthful_qa, Size: 12 |||
||| Subject: agieval, Size: 12 |||

Made with https://github.com/pharaouk/dharma 🚀
unrealMJ/douyin
---
license: apache-2.0
---
jstet/laouenan-notable-people
---
license: cc-by-sa-4.0
---
Laouenan, M., Bhargava, P., Eymeoud, J.-B., Gergaud, O., Plique, G., & Wasmer, E. (2023). A Brief History of Human Time - Cross-verified Dataset. data.sciencespo. doi: 10.21410/7E4/RDAG3O
Eitanli/wine_type
---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: recipe
    dtype: string
  - name: wine_type
    dtype: string
  splits:
  - name: train
    num_bytes: 110426494
    num_examples: 74465
  download_size: 54694496
  dataset_size: 110426494
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "wine_type"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_INSAIT-Institute__BgGPT-7B-Instruct-v0.2
--- pretty_name: Evaluation run of INSAIT-Institute/BgGPT-7B-Instruct-v0.2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [INSAIT-Institute/BgGPT-7B-Instruct-v0.2](https://huggingface.co/INSAIT-Institute/BgGPT-7B-Instruct-v0.2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_INSAIT-Institute__BgGPT-7B-Instruct-v0.2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-03T23:20:48.301798](https://huggingface.co/datasets/open-llm-leaderboard/details_INSAIT-Institute__BgGPT-7B-Instruct-v0.2/blob/main/results_2024-03-03T23-20-48.301798.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6046331936407989,\n\ \ \"acc_stderr\": 0.032980055225792684,\n \"acc_norm\": 0.608685542837201,\n\ \ \"acc_norm_stderr\": 0.0336435169198773,\n \"mc1\": 0.37576499388004897,\n\ \ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5462834271948728,\n\ \ \"mc2_stderr\": 0.015488298895953717\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212865,\n\ \ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.01428052266746732\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6303525194184425,\n\ \ \"acc_stderr\": 0.00481722729224028,\n \"acc_norm\": 0.8218482374029078,\n\ \ \"acc_norm_stderr\": 0.0038185843846355286\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\ \ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\ \ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\ \ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\ \ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\ \ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\ \ \"acc_norm_stderr\": 0.038760854559127644\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\ \ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\ \ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\ \ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\ \ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\ \ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\ \ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\ \ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\ acc_norm\": 0.42592592592592593,\n 
\"acc_norm_stderr\": 0.02546714904546955\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\ \ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\ \ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\ \ \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.7419354838709677,\n\ \ \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\ acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\ \ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\ \ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \ \ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \ \ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039932,\n \"\ acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039932\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\ acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"\ acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \ \ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\ \ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\ \ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\ \ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516301,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516301\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\ \ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\ \ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\ \ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\ \ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\ \ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\ \ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n\ \ \"acc_stderr\": 0.014616099385833671,\n \"acc_norm\": 0.7879948914431673,\n\ \ \"acc_norm_stderr\": 0.014616099385833671\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\ \ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n\ \ \"acc_stderr\": 0.0161658475835633,\n 
\"acc_norm\": 0.37206703910614525,\n\ \ \"acc_norm_stderr\": 0.0161658475835633\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281416,\n\ \ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281416\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\ \ \"acc_stderr\": 0.02698147804364805,\n \"acc_norm\": 0.6559485530546624,\n\ \ \"acc_norm_stderr\": 0.02698147804364805\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\ \ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \ \ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n\ \ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n\ \ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\ \ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6094771241830066,\n \"acc_stderr\": 0.019737008998094597,\n \ \ \"acc_norm\": 0.6094771241830066,\n \"acc_norm_stderr\": 0.019737008998094597\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982073,\n\ \ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982073\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\ \ \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.7711442786069652,\n\ \ \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \ \ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \ \ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\ acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\ \ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n\ \ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5462834271948728,\n\ \ \"mc2_stderr\": 0.015488298895953717\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.01192000816365088\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44124336618650495,\n \ \ \"acc_stderr\": 0.013677059478592636\n }\n}\n```" repo_url: https://huggingface.co/INSAIT-Institute/BgGPT-7B-Instruct-v0.2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|arc:challenge|25_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-03T23-20-48.301798.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|gsm8k|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_03T23_20_48.301798 path: - 
'**/details_harness|hellaswag|10_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T23-20-48.301798.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-03T23-20-48.301798.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T23-20-48.301798.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T23-20-48.301798.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T23-20-48.301798.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-03T23-20-48.301798.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T23-20-48.301798.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-management|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T23-20-48.301798.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|truthfulqa:mc|0_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-03T23-20-48.301798.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_03T23_20_48.301798 path: - '**/details_harness|winogrande|5_2024-03-03T23-20-48.301798.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-03T23-20-48.301798.parquet' - config_name: results data_files: - split: 
2024_03_03T23_20_48.301798
    path:
    - results_2024-03-03T23-20-48.301798.parquet
  - split: latest
    path:
    - results_2024-03-03T23-20-48.301798.parquet
---

# Dataset Card for Evaluation run of INSAIT-Institute/BgGPT-7B-Instruct-v0.2

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [INSAIT-Institute/BgGPT-7B-Instruct-v0.2](https://huggingface.co/INSAIT-Institute/BgGPT-7B-Instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_INSAIT-Institute__BgGPT-7B-Instruct-v0.2",
	"harness_winogrande_5",
	split="latest")
```

## Latest results

These are the [latest results from run 2024-03-03T23:20:48.301798](https://huggingface.co/datasets/open-llm-leaderboard/details_INSAIT-Institute__BgGPT-7B-Instruct-v0.2/blob/main/results_2024-03-03T23-20-48.301798.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6046331936407989, "acc_stderr": 0.032980055225792684, "acc_norm": 0.608685542837201, "acc_norm_stderr": 0.0336435169198773, "mc1": 0.37576499388004897, "mc1_stderr": 0.016954584060214297, "mc2": 0.5462834271948728, "mc2_stderr": 0.015488298895953717 }, "harness|arc:challenge|25": { "acc": 0.5614334470989761, "acc_stderr": 0.014500682618212865, "acc_norm": 0.60580204778157, "acc_norm_stderr": 0.01428052266746732 }, "harness|hellaswag|10": { "acc": 0.6303525194184425, "acc_stderr": 0.00481722729224028, "acc_norm": 0.8218482374029078, "acc_norm_stderr": 0.0038185843846355286 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5481481481481482, "acc_stderr": 0.04299268905480864, "acc_norm": 0.5481481481481482, "acc_norm_stderr": 0.04299268905480864 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.03823428969926605, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.03823428969926605 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6875, "acc_stderr": 0.038760854559127644, "acc_norm": 0.6875, "acc_norm_stderr": 0.038760854559127644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077615, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077615 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5106382978723404, "acc_stderr": 0.03267862331014063, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.03267862331014063 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.02546714904546955, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.02546714904546955 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.043435254289490965, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.043435254289490965 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7419354838709677, "acc_stderr": 0.02489246917246283, "acc_norm": 0.7419354838709677, "acc_norm_stderr": 0.02489246917246283 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.029857515673386414, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.029857515673386414 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6230769230769231, "acc_stderr": 0.024570975364225995, "acc_norm": 0.6230769230769231, "acc_norm_stderr": 0.024570975364225995 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085626, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085626 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6260504201680672, "acc_stderr": 0.03142946637883708, "acc_norm": 0.6260504201680672, "acc_norm_stderr": 0.03142946637883708 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8165137614678899, "acc_stderr": 0.01659525971039932, "acc_norm": 0.8165137614678899, "acc_norm_stderr": 0.01659525971039932 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.03407632093854052, "acc_norm": 0.48148148148148145, 
"acc_norm_stderr": 0.03407632093854052 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.027599174300640773, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.027599174300640773 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601443, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601443 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229146, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229146 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6870229007633588, "acc_stderr": 0.04066962905677697, "acc_norm": 0.6870229007633588, "acc_norm_stderr": 0.04066962905677697 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516301, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516301 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.046840993210771065, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.046840993210771065 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, 
"acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7879948914431673, "acc_stderr": 0.014616099385833671, "acc_norm": 0.7879948914431673, "acc_norm_stderr": 0.014616099385833671 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6502890173410405, "acc_stderr": 0.025674281456531018, "acc_norm": 0.6502890173410405, "acc_norm_stderr": 0.025674281456531018 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.37206703910614525, "acc_stderr": 0.0161658475835633, "acc_norm": 0.37206703910614525, "acc_norm_stderr": 0.0161658475835633 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.673202614379085, "acc_stderr": 0.026857294663281416, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.026857294663281416 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6559485530546624, "acc_stderr": 0.02698147804364805, "acc_norm": 0.6559485530546624, "acc_norm_stderr": 0.02698147804364805 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.44680851063829785, "acc_stderr": 0.029658235097666907, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.029658235097666907 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44002607561929596, "acc_stderr": 0.012678037478574513, "acc_norm": 0.44002607561929596, "acc_norm_stderr": 0.012678037478574513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.02934980313976587, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.02934980313976587 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6094771241830066, "acc_stderr": 0.019737008998094597, "acc_norm": 0.6094771241830066, "acc_norm_stderr": 0.019737008998094597 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 
0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6816326530612244, "acc_stderr": 0.029822533793982073, "acc_norm": 0.6816326530612244, "acc_norm_stderr": 0.029822533793982073 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7711442786069652, "acc_stderr": 0.029705284056772426, "acc_norm": 0.7711442786069652, "acc_norm_stderr": 0.029705284056772426 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036623, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.031267817146631786, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.031267817146631786 }, "harness|truthfulqa:mc|0": { "mc1": 0.37576499388004897, "mc1_stderr": 0.016954584060214297, "mc2": 0.5462834271948728, "mc2_stderr": 0.015488298895953717 }, "harness|winogrande|5": { "acc": 0.7647987371744278, "acc_stderr": 0.01192000816365088 }, "harness|gsm8k|5": { "acc": 0.44124336618650495, "acc_stderr": 0.013677059478592636 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
Multimodal-Fatima/Caltech101_with_background_test_facebook_opt_6.7b_Attributes_Caption_ns_6084_random
--- dataset_info: features: - name: id dtype: int64 - name: image dtype: image - name: prompt dtype: string - name: true_label dtype: string - name: prediction dtype: string - name: scores sequence: float64 splits: - name: fewshot_1_bs_16 num_bytes: 102753879.5 num_examples: 6084 - name: fewshot_3_bs_16 num_bytes: 105999857.5 num_examples: 6084 download_size: 193316942 dataset_size: 208753737.0 --- # Dataset Card for "Caltech101_with_background_test_facebook_opt_6.7b_Attributes_Caption_ns_6084_random" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
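The card itself does not document how `prediction` relates to `scores`; assuming `scores` holds one value per candidate class (an assumption, not something stated above), a prediction can be recovered as the argmax. A minimal illustrative sketch:

```python
def predict_from_scores(candidate_labels, scores):
    """Pick the candidate label with the highest score.

    Assumes `candidate_labels` and `scores` are parallel sequences;
    this mirrors a generic argmax readout, not this dataset's exact pipeline.
    """
    if len(candidate_labels) != len(scores):
        raise ValueError("candidate_labels and scores must have equal length")
    best = max(range(len(scores)), key=lambda i: scores[i])
    return candidate_labels[best]


# Hypothetical example with made-up class names and log-probability-like scores.
print(predict_from_scores(["airplane", "butterfly", "camera"], [-2.3, -0.7, -1.9]))  # butterfly
```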
liuyanchen1015/MULTI_VALUE_cola_will_would
--- dataset_info: features: - name: sentence dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 3702 num_examples: 41 - name: test num_bytes: 3560 num_examples: 39 - name: train num_bytes: 30271 num_examples: 370 download_size: 21681 dataset_size: 37533 --- # Dataset Card for "MULTI_VALUE_cola_will_would" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
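Purely as an illustrative sketch (the card does not define `value_score`; here it is assumed to indicate how strongly the will→would dialect transformation applies to a sentence), one might filter examples by that score:

```python
# Hypothetical rows mirroring the schema above (sentence, label, idx, value_score);
# the sentences and scores are invented for illustration only.
rows = [
    {"sentence": "She will leave tomorrow.", "label": 1, "idx": 0, "value_score": 0},
    {"sentence": "She would leave tomorrow.", "label": 1, "idx": 1, "value_score": 1},
    {"sentence": "They would be here soon.", "label": 1, "idx": 2, "value_score": 2},
]

# Keep only the examples the transformation actually touched.
transformed = [r for r in rows if r["value_score"] > 0]
print([r["idx"] for r in transformed])  # [1, 2]
```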
CyberHarem/bubble_arknights
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of bubble/バブル/泡泡 (Arknights) This is the dataset of bubble/バブル/泡泡 (Arknights), containing 17 images and their tags. The core tags of this character are `brown_hair, long_hair, horns, single_horn, ponytail, animal_ears, bow, hair_ornament, hair_bow, horse_ears, hairclip, green_eyes, horse_girl`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 17 | 18.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 1200 | 17 | 16.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 39 | 32.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bubble_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. 
If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/bubble_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------| | 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, open_mouth, armor, looking_at_viewer, gloves, white_background, blush, full_body, holding | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | open_mouth | armor | looking_at_viewer | gloves | white_background | blush | full_body | holding | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:--------|:--------------------|:---------|:-------------------|:--------|:------------|:----------| | 0 | 17 | ![](samples/0/clu0-sample0.png) | 
![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X |
autoevaluate/autoeval-eval-acronym_identification-default-d87697-95015146250
--- type: predictions tags: - autotrain - evaluation datasets: - acronym_identification eval_info: task: entity_extraction model: lewtun/autotrain-acronym-identification-7324788 metrics: ['code_eval', 'lvwerra/ai4code'] dataset_name: acronym_identification dataset_config: default dataset_split: train col_mapping: tokens: tokens tags: labels --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Token Classification * Model: lewtun/autotrain-acronym-identification-7324788 * Dataset: acronym_identification * Config: default * Split: train To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@ebinum](https://huggingface.co/ebinum) for evaluating this model.
japanese-asr/whisper_transcriptions.reazonspeech.all_51
--- dataset_info: config_name: all features: - name: name dtype: string - name: audio dtype: audio: sampling_rate: 16000 - name: transcription dtype: string - name: whisper_transcript sequence: int64 splits: - name: train num_bytes: 30423889983.0 num_examples: 267358 download_size: 30186859287 dataset_size: 30423889983.0 configs: - config_name: all data_files: - split: train path: all/train-* ---
MohamedSaeed-dev/PyCode
--- license: llama2 ---
ocolegro/ts_train
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 23751416 num_examples: 11414 download_size: 8011655 dataset_size: 23751416 --- # Dataset Card for "ts_train" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
huggingartists/sugar-ray
--- language: - en tags: - huggingartists - lyrics --- # Dataset Card for "huggingartists/sugar-ray" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [How to use](#how-to-use) - [Dataset Structure](#dataset-structure) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [About](#about) ## Dataset Description - **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Size of the generated dataset:** 0.164888 MB <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: 
url(&#39;https://images.genius.com/8b5c8fe74f6176047b2b5681e0e0e2d4.273x273x1.jpg&#39;)"> </div> </div> <a href="https://huggingface.co/huggingartists/sugar-ray"> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div> </a> <div style="text-align: center; font-size: 16px; font-weight: 800">Sugar Ray</div> <a href="https://genius.com/artists/sugar-ray"> <div style="text-align: center; font-size: 14px;">@sugar-ray</div> </a> </div> ### Dataset Summary The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists. Model is available [here](https://huggingface.co/huggingartists/sugar-ray). ### Supported Tasks and Leaderboards [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Languages en ## How to use How to load this dataset directly with the datasets library: ```python from datasets import load_dataset dataset = load_dataset("huggingartists/sugar-ray") ``` ## Dataset Structure An example of 'train' looks as follows. ``` This example was too long and was cropped: { "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..." } ``` ### Data Fields The data fields are the same among all splits. - `text`: a `string` feature. 
### Data Splits

| train | validation | test |
|------:|-----------:|-----:|
|   117 |          - |    - |

'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code:

```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/sugar-ray")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

train, validation, test = np.split(datasets['train']['text'],
                                   [int(len(datasets['train']['text'])*train_percentage),
                                    int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```

## Dataset Creation

### Curation Rationale

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the source language producers?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Annotations

#### Annotation process

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year=2021
}
```

## About

*Built by Aleksey Korshuk*

[![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
liuyanchen1015/MULTI_VALUE_mnli_regularized_reflexives
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 19290 num_examples: 94 - name: dev_mismatched num_bytes: 23196 num_examples: 87 - name: test_matched num_bytes: 21638 num_examples: 90 - name: test_mismatched num_bytes: 20725 num_examples: 82 - name: train num_bytes: 938071 num_examples: 3883 download_size: 579285 dataset_size: 1022920 --- # Dataset Card for "MULTI_VALUE_mnli_regularized_reflexives" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
xenon3134-mc/empty-eyes-dataset
---
license: mit
size_categories:
- n<1K
---

A dataset of AI-generated images and images modified from them.

Products using this dataset:
- [empty-eyes-LoRAs](https://huggingface.co/xenon3134-mc/empty-eyes-LoRAs)
atmallen/quirky_popqa_pythia-410m_alice_easy
--- dataset_info: features: - name: id dtype: string - name: choices sequence: string - name: label dtype: int64 - name: popularity dtype: int64 - name: difficulty dtype: float64 - name: statement dtype: string - name: character dtype: string - name: alice_label dtype: bool - name: bob_label dtype: bool - name: bob_log_odds dtype: float64 splits: - name: train num_bytes: 956505.0212765958 num_examples: 6132 - name: validation num_bytes: 72149.154 num_examples: 462 - name: test num_bytes: 76436.57 num_examples: 490 download_size: 402157 dataset_size: 1105090.7452765957 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
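The schema includes a `bob_log_odds` column. Purely as a generic reminder (not documentation of how this column was computed), a log-odds value maps to a probability through the logistic function:

```python
import math


def log_odds_to_prob(log_odds: float) -> float:
    """Convert a log-odds value to a probability via the logistic (sigmoid) function."""
    return 1.0 / (1.0 + math.exp(-log_odds))


# Log-odds of 0 is an even chance; the sign says which outcome is favoured.
print(log_odds_to_prob(0.0))  # 0.5
```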
neoALI/layout-detector-flagged-samples
--- configs: - config_name: default data_files: - split: train path: data.csv --- # Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. 
It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. 
--> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
Marcos7fytyg/Dataset.Pain
--- license: apache-2.0 ---
k-seungri/k_whisper_dataset_prepocessing
--- dataset_info: features: - name: input_features sequence: sequence: float32 - name: labels sequence: int64 splits: - name: train num_bytes: 99896792 num_examples: 104 - name: test num_bytes: 13447160 num_examples: 14 - name: valid num_bytes: 12487928 num_examples: 13 download_size: 18434628 dataset_size: 125831880 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: valid path: data/valid-* ---
CATIE-AQ/orange_sum_fr_prompt_text_generation_from_title_of_an_article
--- language: - fr license: cc-by-sa-4.0 size_categories: - 100K<n<1M task_categories: - text-generation tags: - DFP - french prompts annotations_creators: - found language_creators: - found multilinguality: - monolingual source_datasets: - orange_sum --- # orange_sum_fr_prompt_text_generation_from_title_of_an_article ## Summary **orange_sum_fr_prompt_text_generation_from_title_of_an_article** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP). It contains **908,793** rows that can be used for a text-generation task. The original data (without prompts) comes from the dataset [orange_sum](https://huggingface.co/datasets/orange_sum) by Eddine et al. A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al. ## Prompts used ### List 27 prompts were created for this dataset. The logic applied consists in proposing prompts in the infinitive, in the tutoiement form, and in the vouvoiement form.
``` 'Rédiger un texte dont le titre est : "'+title+'".', 'Rédige un texte dont le titre est : "'+title+'".', 'Rédigez un texte dont le titre est : "'+title+'".', 'Rédiger une article dont le titre est : "'+title+'".', 'Rédige un article dont le titre est : "'+title+'".', 'Rédigez un article dont le titre est : "'+title+'".', 'Rédiger un document dont le titre est : "'+title+'".', 'Rédige un document dont le titre est : "'+title+'".', 'Rédigez un document dont le titre est : "'+title+'".', 'Génèrer un texte dont le titre est : "'+title+'".\nTexte : ', 'Génère un texte dont le titre est : "'+title+'".\nTexte : ', 'Génèrez un texte dont le titre est : "'+title+'".\nTexte : ', 'Génèrer un article dont le titre est : "'+title+'".\nArticle : ', 'Génère un article dont le titre est : "'+title+'".\nArticle : ', 'Génèrez un article dont le titre est : "'+title+'".\nArticle : ', 'Génèrer un document dont le titre est : "'+title+'".\nDocument : ', 'Génère un document dont le titre est : "'+title+'".\nDocument : ', 'Génèrez un document dont le titre est : "'+title+'".\nDocument : ', '"'+title +'"\n Ecrire un texte de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecris un texte de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecrivez un texte de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecrire un article de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecris un article de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecrivez un article de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecrire un document de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecris un document de 1 à 5 phrases sur le titre précédent : ', '"'+title +'"\n Ecrivez un document de 1 à 5 phrases sur le titre précédent : ' ``` ### Features used in the prompts In the prompt list above, `title` and `targets` have been constructed from: ``` orange_sum = load_dataset('orange_sum','title') title = 
orange_sum['train'][i]['summary'] targets = orange_sum['train'][i]['text'] ``` # Splits - `train` with 827,793 samples - `valid` with 40,500 samples - `test` with 40,500 samples # How to use? ``` from datasets import load_dataset dataset = load_dataset("CATIE-AQ/orange_sum_fr_prompt_text_generation_from_title_of_an_article") ``` # Citation ## Original data > @article{eddine2020barthez, title={BARThez: a Skilled Pretrained French Sequence-to-Sequence Model}, author={Eddine, Moussa Kamal and Tixier, Antoine J-P and Vazirgiannis, Michalis}, journal={arXiv preprint arXiv:2010.12321}, year={2020} } ## This Dataset > @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023, author = { {Centre Aquitain des Technologies de l'Information et Electroniques} }, title = { DFP (Revision 1d24c09) }, year = 2023, url = { https://huggingface.co/datasets/CATIE-AQ/DFP }, doi = { 10.57967/hf/1200 }, publisher = { Hugging Face } } ## License CC-BY-SA-4.0
hammondsugar/en-tw
--- license: mit ---
davidho27941/steins_gate_1k_v1.1_conversation
--- dataset_info: features: - name: conversation list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 115412.4 num_examples: 855 - name: test num_bytes: 12823.6 num_examples: 95 download_size: 77093 dataset_size: 128236.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
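The YAML above documents a single `conversation` feature: a list of turns, each with `content` and `role` fields. As a minimal sketch (the dialogue text below is invented, not taken from the dataset), one common way to consume such records is to flatten the role-tagged turns into a single training string:

```python
# Illustrative only: `record` mimics the `conversation` feature documented
# above (a list of {content, role} turns); the text here is made up.
record = {
    "conversation": [
        {"role": "user", "content": "El Psy Kongroo."},
        {"role": "assistant", "content": "This is Hououin Kyouma."},
    ]
}

def to_training_text(conversation):
    """Flatten role-tagged turns into one newline-separated string."""
    return "\n".join(f"{turn['role']}: {turn['content']}" for turn in conversation)

print(to_training_text(record["conversation"]))
```

Real pipelines would typically substitute a model-specific chat template for the plain `role: content` serialization shown here.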
mrm8488/test2
--- license: wtfpl ---
irds/neumarco_ru_dev_judged
--- pretty_name: '`neumarco/ru/dev/judged`' viewer: false source_datasets: ['irds/neumarco_ru', 'irds/neumarco_ru_dev'] task_categories: - text-retrieval --- # Dataset Card for `neumarco/ru/dev/judged` The `neumarco/ru/dev/judged` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/ru/dev/judged). # Data This dataset provides: - `queries` (i.e., topics); count=55,578 - For `docs`, use [`irds/neumarco_ru`](https://huggingface.co/datasets/irds/neumarco_ru) - For `qrels`, use [`irds/neumarco_ru_dev`](https://huggingface.co/datasets/irds/neumarco_ru_dev) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/neumarco_ru_dev_judged', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format.
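Since the judged queries here are meant to be paired with the qrels from `irds/neumarco_ru_dev`, a typical first step is to index the qrels by query. The rows below are hand-made stand-ins shaped like the ir-datasets records (real data would come from the two datasets linked above):

```python
# Illustrative sketch: join judged queries with their relevance judgments.
# These rows are fabricated examples with the same fields as the real records.
queries = [{"query_id": "q1", "text": "пример запроса"}]
qrels = [
    {"query_id": "q1", "doc_id": "d42", "relevance": 1},
    {"query_id": "q1", "doc_id": "d99", "relevance": 0},
]

# Index qrels as query_id -> {doc_id: relevance} for fast lookup.
judgments = {}
for row in qrels:
    judgments.setdefault(row["query_id"], {})[row["doc_id"]] = row["relevance"]

for query in queries:
    relevant = sorted(
        doc for doc, rel in judgments.get(query["query_id"], {}).items() if rel > 0
    )
    print(query["query_id"], relevant)  # q1 ['d42']
```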
Yuhthe/samsum
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: validation path: data/validation-* dataset_info: features: - name: id dtype: string - name: dialogue dtype: string - name: summary dtype: string splits: - name: train num_bytes: 9479117 num_examples: 14732 - name: test num_bytes: 534480 num_examples: 819 - name: validation num_bytes: 516419 num_examples: 818 download_size: 6737195 dataset_size: 10530016 task_categories: - summarization language: - vi --- # Dataset Card for "samsum" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
skrishna/SeqSense_mcq_8
--- dataset_info: features: - name: input dtype: string - name: answer dtype: string splits: - name: train num_bytes: 24480 num_examples: 300 download_size: 8809 dataset_size: 24480 --- # Dataset Card for "SeqSense_mcq_8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
eedu/luangalinha
--- license: openrail ---
HPalaciosMi/sentiment-banking
--- dataset_info: features: - name: text dtype: string - name: inputs struct: - name: text dtype: string - name: prediction list: - name: label dtype: string - name: score dtype: float64 - name: prediction_agent dtype: string - name: annotation dtype: 'null' - name: annotation_agent dtype: 'null' - name: vectors dtype: 'null' - name: multi_label dtype: bool - name: explanation dtype: 'null' - name: id dtype: string - name: metadata struct: - name: category dtype: int64 - name: status dtype: string - name: event_timestamp dtype: timestamp[us] - name: metrics dtype: 'null' splits: - name: train num_bytes: 1445808 num_examples: 5001 download_size: 642951 dataset_size: 1445808 configs: - config_name: default data_files: - split: train path: data/train-* ---
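The `prediction` feature above is a nested list of `{label, score}` entries per example. As a minimal sketch of working with that structure (the record below is fabricated; the label names and scores are invented, not taken from the dataset):

```python
# Illustrative only: `record` mirrors the nested `prediction` field documented
# above (a list of {label, score} dicts); all values here are made up.
record = {
    "text": "How do I locate my card?",
    "prediction": [
        {"label": "card_arrival", "score": 0.91},
        {"label": "card_not_working", "score": 0.06},
        {"label": "lost_or_stolen_card", "score": 0.03},
    ],
    "annotation": None,
    "multi_label": False,
}

def top_prediction(rec):
    """Return the highest-scoring predicted label, or None if there is none."""
    predictions = rec.get("prediction") or []
    if not predictions:
        return None
    return max(predictions, key=lambda p: p["score"])["label"]

print(top_prediction(record))  # card_arrival
```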
CyberHarem/haguro_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of haguro/羽黒/羽黑 (Azur Lane) This is the dataset of haguro/羽黒/羽黑 (Azur Lane), containing 11 images and their tags. The core tags of this character are `black_hair, hair_ornament, red_eyes, bangs, earrings, hairclip, breasts, ear_piercing, multicolored_hair, hair_between_eyes, streaked_hair, ponytail, white_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 11 | 16.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 11 | 8.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 28 | 19.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 11 | 13.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 28 | 28.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/haguro_azurlane', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | midriff, 1girl, black_choker, crop_top, jewelry, navel, solo, black_shirt, looking_at_viewer, piercing, short_sleeves, pleated_skirt, belt, black_serafuku, black_skirt, stomach, black_nails, black_sailor_collar, blush, closed_mouth, collarbone, holding, nail_polish, purple_neckerchief, sitting | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | midriff | 1girl | black_choker | crop_top | jewelry | navel | solo | black_shirt | looking_at_viewer | piercing | short_sleeves | pleated_skirt | belt | black_serafuku | black_skirt | stomach | black_nails | black_sailor_collar | blush | closed_mouth | collarbone | holding | nail_polish | purple_neckerchief | sitting | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------|:--------|:---------------|:-----------|:----------|:--------|:-------|:--------------|:--------------------|:-----------|:----------------|:----------------|:-------|:-----------------|:--------------|:----------|:--------------|:----------------------|:--------|:---------------|:-------------|:----------|:--------------|:---------------------|:----------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
KolyaForger/mangatest
--- license: afl-3.0 ---
open-llm-leaderboard/details_nasiruddin15__Mistral-dolphin-2.8-grok-instract-2-7B-slerp
--- pretty_name: Evaluation run of nasiruddin15/Mistral-dolphin-2.8-grok-instract-2-7B-slerp dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [nasiruddin15/Mistral-dolphin-2.8-grok-instract-2-7B-slerp](https://huggingface.co/nasiruddin15/Mistral-dolphin-2.8-grok-instract-2-7B-slerp)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\ \ be found as a specific split in each configuration, the split being named using\ \ the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nasiruddin15__Mistral-dolphin-2.8-grok-instract-2-7B-slerp\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-03T01:59:25.249541](https://huggingface.co/datasets/open-llm-leaderboard/details_nasiruddin15__Mistral-dolphin-2.8-grok-instract-2-7B-slerp/blob/main/results_2024-04-03T01-59-25.249541.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.629995379055331,\n\ \ \"acc_stderr\": 0.032612799341556774,\n \"acc_norm\": 0.6338419302451002,\n\ \ \"acc_norm_stderr\": 0.03326393949397101,\n \"mc1\": 0.3574051407588739,\n\ \ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5173783767474156,\n\ \ \"mc2_stderr\": 0.015444454142990593\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268438,\n\ \ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6509659430392352,\n\ \ \"acc_stderr\": 0.004756905819649977,\n \"acc_norm\": 0.8441545508862777,\n\ \ \"acc_norm_stderr\": 0.003619674864035018\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\ \ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\ \ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\ \ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\ \ \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\ \ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\ \ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\ \ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\ \ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\ \ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\ \ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\ \ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\ acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 
0.025446365634406783\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7290322580645161,\n \"acc_stderr\": 0.025284416114900156,\n \"\ acc_norm\": 0.7290322580645161,\n \"acc_norm_stderr\": 0.025284416114900156\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"\ acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\ \ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\ : 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\ \ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830506,\n\ \ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830506\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \ \ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059295,\n\ \ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059295\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739152,\n \"\ acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739152\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"\ acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \ \ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\ \ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\ \ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\ \ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990948,\n \"\ acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990948\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\ \ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\ \ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\ \ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\ \ \"acc_stderr\": 0.01396439376989913,\n \"acc_norm\": 0.8122605363984674,\n\ \ \"acc_norm_stderr\": 0.01396439376989913\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\ \ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\ \ \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 
0.43575418994413406,\n\ \ \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\ \ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\ \ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\ \ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\ \ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \ \ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\ \ \"acc_stderr\": 0.012704030518851488,\n \"acc_norm\": 0.4491525423728814,\n\ \ \"acc_norm_stderr\": 0.012704030518851488\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n\ \ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \ \ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\ \ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\ \ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n\ \ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\ \ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\ \ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\ \ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5173783767474156,\n\ \ \"mc2_stderr\": 0.015444454142990593\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48673237300985595,\n \ \ \"acc_stderr\": 0.013767635127026322\n }\n}\n```" repo_url: https://huggingface.co/nasiruddin15/Mistral-dolphin-2.8-grok-instract-2-7B-slerp leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|arc:challenge|25_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-03T01-59-25.249541.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|gsm8k|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hellaswag_10 data_files: - split: 
2024_04_03T01_59_25.249541 path: - '**/details_harness|hellaswag|10_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-59-25.249541.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-59-25.249541.parquet' - 
'**/details_harness|hendrycksTest-management|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-59-25.249541.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-59-25.249541.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-59-25.249541.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-59-25.249541.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-59-25.249541.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-59-25.249541.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-management|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-59-25.249541.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-59-25.249541.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|truthfulqa:mc|0_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-03T01-59-25.249541.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_03T01_59_25.249541 path: - '**/details_harness|winogrande|5_2024-04-03T01-59-25.249541.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-03T01-59-25.249541.parquet' - config_name: results data_files: - split: 
2024_04_03T01_59_25.249541 path: - results_2024-04-03T01-59-25.249541.parquet - split: latest path: - results_2024-04-03T01-59-25.249541.parquet --- # Dataset Card for Evaluation run of nasiruddin15/Mistral-dolphin-2.8-grok-instract-2-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [nasiruddin15/Mistral-dolphin-2.8-grok-instract-2-7B-slerp](https://huggingface.co/nasiruddin15/Mistral-dolphin-2.8-grok-instract-2-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_nasiruddin15__Mistral-dolphin-2.8-grok-instract-2-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-04-03T01:59:25.249541](https://huggingface.co/datasets/open-llm-leaderboard/details_nasiruddin15__Mistral-dolphin-2.8-grok-instract-2-7B-slerp/blob/main/results_2024-04-03T01-59-25.249541.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.629995379055331, "acc_stderr": 0.032612799341556774, "acc_norm": 0.6338419302451002, "acc_norm_stderr": 0.03326393949397101, "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.5173783767474156, "mc2_stderr": 0.015444454142990593 }, "harness|arc:challenge|25": { "acc": 0.5964163822525598, "acc_stderr": 0.014337158914268438, "acc_norm": 0.6390784982935154, "acc_norm_stderr": 0.014034761386175452 }, "harness|hellaswag|10": { "acc": 0.6509659430392352, "acc_stderr": 0.004756905819649977, "acc_norm": 0.8441545508862777, "acc_norm_stderr": 0.003619674864035018 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.037143259063020656, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.037143259063020656 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406783, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7290322580645161, "acc_stderr": 0.025284416114900156, "acc_norm": 0.7290322580645161, "acc_norm_stderr": 0.025284416114900156 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009181, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009181 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6256410256410256, "acc_stderr": 0.024537591572830506, "acc_norm": 0.6256410256410256, "acc_norm_stderr": 0.024537591572830506 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.028133252578815642, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.028133252578815642 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059295, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059295 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8201834862385321, "acc_stderr": 0.01646534546739152, "acc_norm": 0.8201834862385321, "acc_norm_stderr": 0.01646534546739152 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 
0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02675640153807897, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02675640153807897 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776679, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776679 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990948, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990948 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5357142857142857, "acc_stderr": 0.04733667890053756, "acc_norm": 0.5357142857142857, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073325, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073325 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 
0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8122605363984674, "acc_stderr": 0.01396439376989913, "acc_norm": 0.8122605363984674, "acc_norm_stderr": 0.01396439376989913 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6936416184971098, "acc_stderr": 0.024818350129436593, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.024818350129436593 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43575418994413406, "acc_stderr": 0.016583881958602394, "acc_norm": 0.43575418994413406, "acc_norm_stderr": 0.016583881958602394 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757482, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757482 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7067901234567902, "acc_stderr": 0.025329888171900926, "acc_norm": 0.7067901234567902, "acc_norm_stderr": 0.025329888171900926 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.02977945095730307, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.02977945095730307 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4491525423728814, "acc_stderr": 0.012704030518851488, "acc_norm": 0.4491525423728814, "acc_norm_stderr": 0.012704030518851488 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.02850145286039655, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.02850145286039655 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 
0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7860696517412935, "acc_stderr": 0.02899690969332891, "acc_norm": 0.7860696517412935, "acc_norm_stderr": 0.02899690969332891 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.5173783767474156, "mc2_stderr": 0.015444454142990593 }, "harness|winogrande|5": { "acc": 0.7821625887924231, "acc_stderr": 0.011601066079939324 }, "harness|gsm8k|5": { "acc": 0.48673237300985595, "acc_stderr": 0.013767635127026322 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
-->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations.
-->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
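For the accuracy-style metrics, the aggregate "all" block near the top of the results is consistent with a plain average over the per-task entries. A minimal illustration with a made-up subset of tasks (the values and the simple-mean assumption are ours, not taken from the result file):

```python
# Illustrative harness-style results: a small made-up subset,
# not the full result file shown in this card.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6391},
    "harness|hellaswag|10":     {"acc_norm": 0.8442},
    "harness|gsm8k|5":          {"acc_norm": 0.4867},
}

# Assuming a simple unweighted mean over tasks.
acc_norm_mean = sum(r["acc_norm"] for r in results.values()) / len(results)
print(round(acc_norm_mean, 4))  # → 0.6567
```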
huggingartists/nirvana
---
language:
- en
tags:
- huggingartists
- lyrics
---

# Dataset Card for "huggingartists/nirvana"

## Table of Contents

- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
  - [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
  - [Curation Rationale](#curation-rationale)
  - [Source Data](#source-data)
  - [Annotations](#annotations)
  - [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
  - [Social Impact of Dataset](#social-impact-of-dataset)
  - [Discussion of Biases](#discussion-of-biases)
  - [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
  - [Dataset Curators](#dataset-curators)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)

## Dataset Description

- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.336531 MB

<div class="inline-flex flex-col" style="line-height: 1.5;">
    <div class="flex">
        <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image:
url(&#39;https://images.genius.com/4c1373962cfc3a668a3e30da9a76a34c.640x640x1.jpg&#39;)">
        </div>
    </div>
    <a href="https://huggingface.co/huggingartists/nirvana">
        <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
    </a>
    <div style="text-align: center; font-size: 16px; font-weight: 800">Nirvana</div>
    <a href="https://genius.com/artists/nirvana">
        <div style="text-align: center; font-size: 14px;">@nirvana</div>
    </a>
</div>

### Dataset Summary

The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists. The model is available [here](https://huggingface.co/huggingartists/nirvana).

### Supported Tasks and Leaderboards

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Languages

en

## How to use

How to load this dataset directly with the datasets library:

```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/nirvana")
```

## Dataset Structure

An example of 'train' looks as follows.

```
This example was too long and was cropped:

{
    "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```

### Data Fields

The data fields are the same among all splits.

- `text`: a `string` feature.
### Data Splits

| train | validation | test |
|------:|-----------:|-----:|
| TRAIN_0.336531 | - | - |

'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code:

```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/nirvana")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)}),
    }
)
```

## Dataset Creation

### Curation Rationale

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the source language producers?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Annotations

#### Annotation process

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```bibtex
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year   = {2021}
}
```
Eric33/MyDataset_project_1
---
license: gpl
---
liuyanchen1015/MULTI_VALUE_sst2_corr_conjunction_doubling
---
dataset_info:
  features:
  - name: sentence
    dtype: string
  - name: label
    dtype: int64
  - name: idx
    dtype: int64
  - name: score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 12080
    num_examples: 71
  - name: test
    num_bytes: 27503
    num_examples: 161
  - name: train
    num_bytes: 234657
    num_examples: 1414
  download_size: 153416
  dataset_size: 274240
---

# Dataset Card for "MULTI_VALUE_sst2_corr_conjunction_doubling"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
iamkaikai/CUBISM-ART
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 15223404.0
    num_examples: 325
  download_size: 15207055
  dataset_size: 15223404.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "CUBISM-ART"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liblinear/russian-paintings-t2i
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 165781456.64100003
    num_examples: 1503
  download_size: 165228421
  dataset_size: 165781456.64100003
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
BigTMiami/amazon_25M_500_000_condensed
---
dataset_info:
  features:
  - name: input_ids
    sequence: int32
  - name: attention_mask
    sequence: int8
  - name: labels
    sequence: int64
  splits:
  - name: train
    num_bytes: 559771932
    num_examples: 83949
  - name: validation
    num_bytes: 55611120
    num_examples: 8340
  download_size: 196115236
  dataset_size: 615383052
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
---
Partha117/oss_bugs
---
dataset_info:
  features:
  - name: status
    dtype: string
  - name: repo_name
    dtype: string
  - name: repo_url
    dtype: string
  - name: issue_id
    dtype: int64
  - name: updated_files
    dtype: string
  - name: title
    dtype: string
  - name: body
    dtype: string
  - name: issue_url
    dtype: string
  - name: pull_url
    dtype: string
  - name: before_fix_sha
    dtype: string
  - name: after_fix_sha
    dtype: string
  - name: report_datetime
    dtype: timestamp[ns, tz=UTC]
  - name: language
    dtype: string
  - name: commit_datetime
    dtype: timestamp[us, tz=UTC]
  splits:
  - name: train
    num_bytes: 78218675
    num_examples: 26321
  download_size: 27477501
  dataset_size: 78218675
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "oss_bugs"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_wnli_his_he
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: label
    dtype: int64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 3017
    num_examples: 12
  - name: test
    num_bytes: 8421
    num_examples: 28
  - name: train
    num_bytes: 27142
    num_examples: 125
  download_size: 22059
  dataset_size: 38580
---

# Dataset Card for "MULTI_VALUE_wnli_his_he"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CVasNLPExperiments/OK-VQA_test_google_flan_t5_xl_mode_CM_Q_rices_ns_5046
---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: prompt
    sequence: string
  - name: question
    dtype: string
  - name: true_label
    sequence: string
  - name: prediction
    dtype: string
  splits:
  - name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
    num_bytes: 27657536
    num_examples: 5046
  download_size: 5163191
  dataset_size: 27657536
---

# Dataset Card for "OK-VQA_test_google_flan_t5_xl_mode_CM_Q_rices_ns_5046"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
abertsch/booksum-fullbooks
---
dataset_info:
  features:
  - name: bid
    dtype: string
  - name: source
    dtype: string
  - name: title
    dtype: string
  - name: summary
    dtype: string
  - name: book
    dtype: string
  splits:
  - name: validation
    num_bytes: 23586559
    num_examples: 45
  - name: train
    num_bytes: 165182724
    num_examples: 314
  - name: test
    num_bytes: 31094987
    num_examples: 46
  download_size: 60336046
  dataset_size: 219864270
---

# Dataset Card for "booksum-fullbooks"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d27cefa1
---
dataset_info:
  features:
  - name: result
    dtype: string
  - name: id
    dtype: int64
  splits:
  - name: train
    num_bytes: 188
    num_examples: 10
  download_size: 1341
  dataset_size: 188
---

# Dataset Card for "d27cefa1"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Saviourscs/Article_Review
---
license: apache-2.0
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 551346
    num_examples: 460
  download_size: 318366
  dataset_size: 551346
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
NickyNicky/aya_dataset_multilingual_inputs_targets_ext9
---
dataset_info:
  features:
  - name: inputs
    dtype: string
  - name: targets
    dtype: string
  - name: language
    dtype: string
  - name: language_code
    dtype: string
  - name: targets_es
    dtype: string
  - name: targets_en
    dtype: string
  - name: targets_fr
    dtype: string
  - name: targets_de
    dtype: string
  - name: inputs_es
    dtype: string
  - name: inputs_en
    dtype: string
  - name: inputs_fr
    dtype: string
  - name: inputs_de
    dtype: string
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 3147601
    num_examples: 1000
  download_size: 2019422
  dataset_size: 3147601
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
dshut002/Mermaid_LLAMA
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: input
    dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 503
    num_examples: 1
  download_size: 4922
  dataset_size: 503
---

# Dataset Card for "Mermaid_LLAMA"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
aswin1906/countries-inflation
---
license: apache-2.0
task_categories:
- tabular-regression
- text-classification
- text-generation
language:
- en
pretty_name: Countries by Inflation rate of 2022
size_categories:
- n<1K
---

# Dataset Summary

Inflation is a critical economic indicator that reflects the overall increase in prices of goods and services within an economy over a specific period. Understanding inflation trends on a global scale is crucial for economists, policymakers, investors, and businesses. This dataset provides comprehensive insights into the inflation rates of various countries for the year 2022. The data is sourced from reputable international organizations and government reports, making it a valuable resource for economic analysis and research.

This dataset includes four essential columns:

1. **Countries:** The names of countries for which inflation data is recorded. Each row represents a specific country.
2. **Inflation, 2022:** The inflation rate for each country in the year 2022. Inflation rates are typically expressed as a percentage and indicate the average increase in prices for that year.
3. **Global Rank:** The rank of each country based on its inflation rate in 2022. Countries with the highest inflation rates have a lower rank number, while those with lower inflation rates have a higher one.
4. **Available Data:** A binary indicator (Yes/No) denoting whether complete and reliable data for inflation in 2022 is available for a particular country. This column helps users identify the data quality and coverage.

## Potential Use Cases

**Economic Analysis:** Researchers and economists can use this dataset to analyze inflation trends globally, identify countries with high or low inflation rates, and make comparisons across regions.

**Investment Decisions:** Investors and financial analysts can incorporate inflation data into their risk assessments and investment strategies.
**Business Planning:** Companies operating in multiple countries can assess the impact of inflation on their costs and pricing strategies, helping them make informed decisions.

## Data Accuracy

Efforts have been made to ensure the accuracy and reliability of the data; however, users are encouraged to cross-reference this dataset with official sources for critical decision-making processes.

## Updates

This dataset will be periodically updated to include the latest available inflation data, making it an ongoing resource for tracking global inflation trends.
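The Global Rank column described above follows directly from the inflation figures: rank 1 goes to the highest rate. A minimal sketch with made-up values (the field names are ours, not the dataset's exact column headers):

```python
# Made-up sample rows; real values come from the dataset itself.
rows = [
    {"country": "A", "inflation_2022": 8.0},
    {"country": "B", "inflation_2022": 72.3},
    {"country": "C", "inflation_2022": 2.5},
]

# Highest inflation gets the lowest (best) rank number.
ranked = sorted(rows, key=lambda r: r["inflation_2022"], reverse=True)
for rank, row in enumerate(ranked, start=1):
    row["global_rank"] = rank

print([(r["country"], r["global_rank"]) for r in ranked])
# → [('B', 1), ('A', 2), ('C', 3)]
```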
timm/imagenet-22k-wds
---
license: other
license_name: imagenet
license_link: https://www.image-net.org/download.php
task_categories:
- image-classification
pretty_name: ImageNet-22k
size_categories:
- 10M<n<100M
extra_gated_prompt: >-
  By clicking on “Access repository” below, you also agree to ImageNet Terms of Access:

  [RESEARCHER_FULLNAME] (the "Researcher") has requested permission to use the ImageNet database (the "Database") at Princeton University and Stanford University. In exchange for such permission, Researcher hereby agrees to the following terms and conditions:

  1. Researcher shall use the Database only for non-commercial research and educational purposes.

  2. Princeton University, Stanford University and Hugging Face make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.

  3. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, Stanford University and Hugging Face, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.

  4. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.

  5. Princeton University, Stanford University and Hugging Face reserve the right to terminate Researcher's access to the Database at any time.

  6. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.

  7. The law of the State of New Jersey shall apply to all disputes under this agreement.
tags:
- webdataset
---

## Dataset Description

- **Homepage:** https://image-net.org/index.php
- **Repository:** https://github.com/rwightman/imagenet-12k
- **Paper:** https://arxiv.org/abs/1409.0575

### Dataset Summary

This is a copy of the full [ImageNet](https://www.image-net.org/) dataset, consisting of all 21841 of the original classes. It also contains labels, in a separate field, for the '12k' subset described at https://github.com/rwightman/imagenet-12k and https://huggingface.co/datasets/timm/imagenet-12k-wds.

This dataset is from the original `fall11` ImageNet release, which has been superseded by the `winter21` release that removes close to 3000 synsets containing people; a number of these are of an offensive or sensitive nature. Work is in progress to filter a similar dataset from `winter21`; there is already [ImageNet-21k-P](https://github.com/Alibaba-MIIL/ImageNet21K/blob/main/dataset_preprocessing/processing_instructions.md), but with different thresholds & preprocessing steps.

### Data Splits

Unlike ImageNet-1k (ILSVRC 2012), the full ImageNet dataset has no defined splits. This instance does include a randomly selected validation split consisting of 40 samples for each of the 11821 classes in ImageNet-12k. The validation split is exactly the same as https://huggingface.co/datasets/timm/imagenet-12k-wds and does not fully cover all 22k classes. Beyond the 12k classes (sorted by # samples), the remaining classes have very few samples each; ImageNet-22k is not a balanced dataset.

#### Train

* `imagenet22k-train-{0000..4095}.tar`
* 13673551 samples over 4096 shards

#### Validation

* `imagenet22k-validation-{0000..0511}.tar`
* 472840 samples over 512 shards

### Processing

I performed some processing while sharding this dataset:

* All exif tags not related to color space were removed.
* All images with width or height < 48 were removed.
* All images with the smallest edge > 600 were resized, maintaining aspect ratio, so that the smallest edge = 600.
Improving size & decoding time uniformity for typical pretrain use cases. * Images were pre-shuffled across the shards ## Additional Information ### Dataset Curators Authors of [[1]](https://arxiv.org/abs/1409.0575) and [[2]](https://ieeexplore.ieee.org/abstract/document/5206848): - Olga Russakovsky - Jia Deng - Hao Su - Jonathan Krause - Sanjeev Satheesh - Wei Dong - Richard Socher - Li-Jia Li - Kai Li - Sean Ma - Zhiheng Huang - Andrej Karpathy - Aditya Khosla - Michael Bernstein - Alexander C Berg - Li Fei-Fei ### Licensing Information In exchange for permission to use the ImageNet database (the "Database") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions: 1. Researcher shall use the Database only for non-commercial research and educational purposes. 1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose. 1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database. 1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions. 1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time. 1. 
If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer. 1. The law of the State of New Jersey shall apply to all disputes under this agreement. ### Citation Information ```bibtex @article{imagenet15russakovsky, Author = {Olga Russakovsky and Jia Deng and Hao Su and Jonathan Krause and Sanjeev Satheesh and Sean Ma and Zhiheng Huang and Andrej Karpathy and Aditya Khosla and Michael Bernstein and Alexander C. Berg and Li Fei-Fei}, Title = { {ImageNet Large Scale Visual Recognition Challenge} }, Year = {2015}, journal = {International Journal of Computer Vision (IJCV)}, doi = {10.1007/s11263-015-0816-y}, volume={115}, number={3}, pages={211-252} } ```
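As a rough illustration of the size rules from the Processing section, here is a minimal sketch in pure Python. `process_size` is a hypothetical helper, not the actual sharding script used for this dataset:

```python
def process_size(width: int, height: int,
                 min_dim: int = 48, max_small_edge: int = 600):
    """Apply the card's size rules to one image's dimensions.

    Returns None if the image would be removed, otherwise the
    (width, height) the image would have after processing.
    """
    if width < min_dim or height < min_dim:
        return None  # images with width or height < 48 were removed
    small = min(width, height)
    if small <= max_small_edge:
        return (width, height)  # small enough already; left untouched
    # resize so the smallest edge equals 600, maintaining aspect ratio
    scale = max_small_edge / small
    return (round(width * scale), round(height * scale))

print(process_size(32, 500))    # None  (too small, removed)
print(process_size(400, 500))   # (400, 500)  (unchanged)
print(process_size(800, 1200))  # (600, 900)  (smallest edge capped at 600)
```

The actual conversion also decoded and re-encoded the images; this sketch only captures the dimension logic.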
DUOMO-Lab/TransGPT-sft
--- license: apache-2.0 ---
namphan410/Test
--- license: unknown ---
erickdp/autotrain-data-tweet-es-sent
---
task_categories:
- text-classification
---

# AutoTrain Dataset for project: tweet-es-sent

## Dataset Description

This dataset has been automatically processed by AutoTrain for project tweet-es-sent.

### Languages

The BCP-47 code for the dataset's language is unk.

## Dataset Structure

### Data Instances

A sample from this dataset looks as follows:

```json
[
  {
    "target": 1,
    "text": "1sola vuelta! arauz presidente! 1sola vuelta! todo 1 1sola la 1 es ecdor! por ti!1 por 1 los tuyos!1 por nosotros juntos1 mas de 45 d apoyo popular el 7 se vota 1por la vida por el futuro,por la esperanza guayaquil ec dor es 1"
  },
  {
    "target": 1,
    "text": "excelente decisi\u00f3n , las mujeres son importantes y por esa raz\u00f3n, a productos de primera necesidad hay que quitarles el iva "
  }
]
```

### Dataset Fields

The dataset has the following fields (also called "features"):

```json
{
  "target": "ClassLabel(num_classes=3, names=['0', '1', '2'], id=None)",
  "text": "Value(dtype='string', id=None)"
}
```

### Dataset Splits

This dataset is split into a train and validation split. The split sizes are as follows:

| Split name | Num samples |
| ------------ | ------------------- |
| train | 12400 |
| valid | 3685 |
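Because `target` is stored as a `ClassLabel`, each integer maps to a name in the `names` list shown above. A minimal pure-Python sketch of that lookup (mirroring what `datasets.ClassLabel.int2str` / `str2int` do; the `names` list is copied from the schema):

```python
# names copied from the ClassLabel definition in the dataset fields
NAMES = ["0", "1", "2"]

def int2str(target: int) -> str:
    """Map a ClassLabel integer id to its string name."""
    return NAMES[target]

def str2int(name: str) -> int:
    """Map a string name back to its integer id."""
    return NAMES.index(name)

sample = {"target": 1, "text": "excelente decisi\u00f3n ..."}
print(int2str(sample["target"]))  # prints 1
```

In practice you would call the same methods on the `ClassLabel` feature object returned by `load_dataset`.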
sharad36/beat
--- license: afl-3.0 ---
dgblife/detection_clp
--- license: other ---
open-llm-leaderboard/details_TwT-6__open_llm_leaderboard_demo2
--- pretty_name: Evaluation run of TwT-6/open_llm_leaderboard_demo2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TwT-6/open_llm_leaderboard_demo2](https://huggingface.co/TwT-6/open_llm_leaderboard_demo2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TwT-6__open_llm_leaderboard_demo2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-15T16:29:36.546610](https://huggingface.co/datasets/open-llm-leaderboard/details_TwT-6__open_llm_leaderboard_demo2/blob/main/results_2024-04-15T16-29-36.546610.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6463960806225129,\n\ \ \"acc_stderr\": 0.03156272465458696,\n \"acc_norm\": 0.6590262499023557,\n\ \ \"acc_norm_stderr\": 0.03241582471085193,\n \"mc1\": 0.3659730722154223,\n\ \ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5249573452382528,\n\ \ \"mc2_stderr\": 0.015222313755138895\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520767,\n\ \ \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407156\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6414060944035053,\n\ \ \"acc_stderr\": 0.004786075107572191,\n \"acc_norm\": 0.8375821549492133,\n\ \ \"acc_norm_stderr\": 0.0036807989505319135\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361074,\n\ \ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361074\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\ \ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\ \ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\ \ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\ \ \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\ : 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\ \ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\ \ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\ \ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\ \ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\ \ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493838,\n \"\ acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 
0.025690321762493838\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\ \ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\ \ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\ acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876105,\n \"\ acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\ \ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\ acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\ \ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n 
\"\ acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \ \ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \ \ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\ acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\ acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918853,\n \"\ acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918853\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318677,\n \ \ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318677\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\ \ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8016528925619835,\n 
\"acc_stderr\": 0.03640118271990945,\n \"\ acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\ \ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\ \ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\ \ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\ \ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\ \ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\ \ \"acc_stderr\": 0.013468201614066302,\n \"acc_norm\": 0.8288633461047255,\n\ \ \"acc_norm_stderr\": 0.013468201614066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\ \ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\ \ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\ \ \"acc_norm_stderr\": 0.016260159604429128\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729487,\n\ \ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729487\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n\ \ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\ : {\n \"acc\": 0.48891786179921776,\n \"acc_stderr\": 0.012767098998525834,\n\ \ \"acc_norm\": 0.48891786179921776,\n \"acc_norm_stderr\": 0.012767098998525834\n\ \ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\ : 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041496,\n \"\ acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041496\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \ \ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\ \ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\ \ \"acc_stderr\": 0.025196929874827072,\n 
\"acc_norm\": 0.8507462686567164,\n\ \ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \ \ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\ \ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\ \ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\ \ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\ \ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5249573452382528,\n\ \ \"mc2_stderr\": 0.015222313755138895\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386795\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/TwT-6/open_llm_leaderboard_demo2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|arc:challenge|25_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-15T16-29-36.546610.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|gsm8k|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hellaswag|10_2024-04-15T16-29-36.546610.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-29-36.546610.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-29-36.546610.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-29-36.546610.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-29-36.546610.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-29-36.546610.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-29-36.546610.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-29-36.546610.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-29-36.546610.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T16-29-36.546610.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_15T16_29_36.546610 path: - '**/details_harness|winogrande|5_2024-04-15T16-29-36.546610.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-15T16-29-36.546610.parquet' - config_name: results data_files: - split: 
2024_04_15T16_29_36.546610 path: - results_2024-04-15T16-29-36.546610.parquet - split: latest path: - results_2024-04-15T16-29-36.546610.parquet
---

# Dataset Card for Evaluation run of TwT-6/open_llm_leaderboard_demo2

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [TwT-6/open_llm_leaderboard_demo2](https://huggingface.co/TwT-6/open_llm_leaderboard_demo2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_TwT-6__open_llm_leaderboard_demo2",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-04-15T16:29:36.546610](https://huggingface.co/datasets/open-llm-leaderboard/details_TwT-6__open_llm_leaderboard_demo2/blob/main/results_2024-04-15T16-29-36.546610.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6463960806225129, "acc_stderr": 0.03156272465458696, "acc_norm": 0.6590262499023557, "acc_norm_stderr": 0.03241582471085193, "mc1": 0.3659730722154223, "mc1_stderr": 0.016862941684088376, "mc2": 0.5249573452382528, "mc2_stderr": 0.015222313755138895 }, "harness|arc:challenge|25": { "acc": 0.5750853242320819, "acc_stderr": 0.014445698968520767, "acc_norm": 0.6237201365187713, "acc_norm_stderr": 0.014157022555407156 }, "harness|hellaswag|10": { "acc": 0.6414060944035053, "acc_stderr": 0.004786075107572191, "acc_norm": 0.8375821549492133, "acc_norm_stderr": 0.0036807989505319135 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.03583496176361074, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.03583496176361074 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663434, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663434 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4656084656084656, "acc_stderr": 0.025690321762493838, "acc_norm": 0.4656084656084656, "acc_norm_stderr": 0.025690321762493838 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8064516129032258, "acc_stderr": 0.022475258525536057, "acc_norm": 0.8064516129032258, "acc_norm_stderr": 0.022475258525536057 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876105, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8363636363636363, "acc_stderr": 0.02888787239548795, "acc_norm": 0.8363636363636363, "acc_norm_stderr": 0.02888787239548795 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.02805779167298902, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.02805779167298902 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.029185714949857406, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.029185714949857406 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461763, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461763 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5833333333333334, "acc_stderr": 
0.033622774366080424, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.033622774366080424 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8676470588235294, "acc_stderr": 0.023784297520918853, "acc_norm": 0.8676470588235294, "acc_norm_stderr": 0.023784297520918853 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8438818565400844, "acc_stderr": 0.023627159460318677, "acc_norm": 0.8438818565400844, "acc_norm_stderr": 0.023627159460318677 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057222, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057222 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.037683359597287434, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.037683359597287434 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990945, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990945 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.04726835553719099, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.04726835553719099 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.03760178006026621, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.03760178006026621 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 
0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066302, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066302 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.02353292543104429, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.02353292543104429 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38324022346368714, "acc_stderr": 0.016260159604429128, "acc_norm": 0.38324022346368714, "acc_norm_stderr": 0.016260159604429128 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729487, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729487 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135118, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135118 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.48891786179921776, "acc_stderr": 0.012767098998525834, "acc_norm": 0.48891786179921776, "acc_norm_stderr": 0.012767098998525834 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7426470588235294, "acc_stderr": 0.026556519470041496, "acc_norm": 0.7426470588235294, "acc_norm_stderr": 0.026556519470041496 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6879084967320261, "acc_stderr": 0.01874501120127766, "acc_norm": 0.6879084967320261, "acc_norm_stderr": 0.01874501120127766 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, 
"acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.025196929874827072, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.025196929874827072 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352202, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352202 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685515, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.3659730722154223, "mc1_stderr": 0.016862941684088376, "mc2": 0.5249573452382528, "mc2_stderr": 0.015222313755138895 }, "harness|winogrande|5": { "acc": 0.7924230465666929, "acc_stderr": 0.011398593419386795 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
-->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations.
-->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
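As noted at the top of this card, each split is named after the timestamp of the run, with the separators rewritten so the name is filesystem-safe. A minimal sketch of that mapping (the replacement scheme shown here is an assumption inferred from the split names in this card, not part of the evaluation harness):

```python
# Hypothetical helper: recover the ISO-style run timestamp from a split
# name such as the one used throughout this card. The '_' -> '-' / ':'
# substitution is inferred from the names shown above.
split_name = "2024_04_15T16_29_36.546610"

date_part, time_part = split_name.split("T")
timestamp = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")

print(timestamp)  # 2024-04-15T16:29:36.546610
```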
yzhuang/autotree_automl_bank-marketing_sgosdt_l256_d3_sd0
---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: input_x
    sequence:
      sequence: float32
  - name: input_y
    sequence:
      sequence: float32
  - name: rtg
    sequence: float64
  - name: status
    sequence:
      sequence: float32
  - name: split_threshold
    sequence:
      sequence: float32
  - name: split_dimension
    sequence: int64
  splits:
  - name: train
    num_bytes: 174960000
    num_examples: 10000
  - name: validation
    num_bytes: 174960000
    num_examples: 10000
  download_size: 72788389
  dataset_size: 349920000
---

# Dataset Card for "autotree_automl_bank-marketing_sgosdt_l256_d3_sd0"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
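The split metadata above can be checked with a little arithmetic; a sketch using only the numbers reported in the `dataset_info` block (the per-example figure is a derived illustration, not a documented field):

```python
# Sanity-check the sizes reported in the dataset_info block above:
# two equal splits whose byte counts sum to dataset_size.
train_bytes = 174_960_000
validation_bytes = 174_960_000
dataset_size = 349_920_000
num_examples = 10_000

assert train_bytes + validation_bytes == dataset_size
print(train_bytes // num_examples)  # 17496 bytes per train example
```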
distilabel-internal-testing/deita-after-conversation
---
dataset_info:
  features:
  - name: evolved_instruction
    dtype: string
  - name: completion
    dtype: string
  - name: meta
    struct:
    - name: category
      dtype: string
    - name: completion
      dtype: string
    - name: id
      dtype: int64
    - name: input
      dtype: 'null'
    - name: motivation_app
      dtype: 'null'
    - name: prompt
      dtype: string
    - name: source
      dtype: string
    - name: subcategory
      dtype: string
  - name: answer
    dtype: string
  - name: model_name
    dtype: string
  - name: evol_instruction_score
    dtype: float64
  - name: evolved_response
    dtype: string
  - name: evol_response_score
    dtype: float64
  - name: conversation
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  splits:
  - name: train
    num_bytes: 6923587
    num_examples: 1800
  download_size: 1022792
  dataset_size: 6923587
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
dclure/laion-aesthetics-12m-umap
---
annotations_creators: []
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
pretty_name: laion-aesthetics-12m-umap
size_categories: []
source_datasets: []
tags:
- laion
- stable-diffusion
- text2img
task_categories: []
task_ids: []
---

# LAION-Aesthetics :: CLIP → UMAP

This dataset is a CLIP (text) → UMAP embedding of the [LAION-Aesthetics dataset](https://laion.ai/blog/laion-aesthetics/) - specifically the [`improved_aesthetics_6plus` version](https://huggingface.co/datasets/ChristophSchuhmann/improved_aesthetics_6plus), which filters the full dataset to images with scores of > 6 under the "aesthetic" filtering model.

Thanks LAION for this amazing corpus!

---

The dataset here includes coordinates for 3 separate UMAP fits using different values for the `n_neighbors` parameter - `10`, `30`, and `60` - which are broken out as separate columns with different suffixes:

- `n_neighbors=10` → (`x_nn10`, `y_nn10`)
- `n_neighbors=30` → (`x_nn30`, `y_nn30`)
- `n_neighbors=60` → (`x_nn60`, `y_nn60`)

### `nn10`

![nn10](https://user-images.githubusercontent.com/814168/189763846-efa9ecc9-3d57-469b-9d4e-02ddc1723265.jpg)

### `nn30`

![nn30](https://user-images.githubusercontent.com/814168/189763863-a67d4bb1-e043-48ec-8c5a-38dce960731b.jpg)

### `nn60`

(The version from [Twitter](https://twitter.com/clured/status/1565399157606580224).)
![nn60](https://user-images.githubusercontent.com/814168/189763872-5847cde5-e03b-45e1-a9be-d95966bc5ded.jpg)

## Pipeline

The script for producing this can be found here: https://github.com/davidmcclure/loam-viz/blob/laion/laion.py

And is very simple - just using the `openai/clip-vit-base-patch32` model out-of-the-box to encode the text captions:

```python
@app.command()
def clip(
    src: str,
    dst: str,
    text_col: str = 'TEXT',
    limit: Optional[int] = typer.Option(None),
    batch_size: int = typer.Option(512),
):
    """Embed with CLIP."""
    df = pd.read_parquet(src)

    if limit:
        df = df.head(limit)

    tokenizer = CLIPTokenizerFast.from_pretrained('openai/clip-vit-base-patch32')
    model = CLIPTextModel.from_pretrained('openai/clip-vit-base-patch32')
    model = model.to(device)

    texts = df[text_col].tolist()

    embeds = []
    for batch in chunked_iter(tqdm(texts), batch_size):

        enc = tokenizer(
            batch,
            return_tensors='pt',
            padding=True,
            truncation=True,
        )

        enc = enc.to(device)

        with torch.no_grad():
            res = model(**enc)

        embeds.append(res.pooler_output.to('cpu'))

    embeds = torch.cat(embeds).numpy()

    np.save(dst, embeds)
    print(embeds.shape)
```

Then using `cuml.GaussianRandomProjection` to do an initial squeeze to 64d (which gets the embedding tensor small enough to fit onto a single GPU for the UMAP) -

```python
@app.command()
def random_projection(src: str, dst: str, dim: int = 64):
    """Random projection on an embedding matrix."""
    rmm.reinitialize(managed_memory=True)

    embeds = np.load(src)

    rp = cuml.GaussianRandomProjection(n_components=dim)
    embeds = rp.fit_transform(embeds)

    np.save(dst, embeds)
    print(embeds.shape)
```

And then `cuml.UMAP` to get from 64d -> 2d -

```python
@app.command()
def umap(
    df_src: str,
    embeds_src: str,
    dst: str,
    n_neighbors: int = typer.Option(30),
    n_epochs: int = typer.Option(1000),
    negative_sample_rate: int = typer.Option(20),
):
    """UMAP to 2d."""
    rmm.reinitialize(managed_memory=True)

    df = pd.read_parquet(df_src)

    embeds = np.load(embeds_src)
    embeds = embeds.astype('float16')

    print(embeds.shape)
    print(embeds.dtype)

    reducer = cuml.UMAP(
        n_neighbors=n_neighbors,
        n_epochs=n_epochs,
        negative_sample_rate=negative_sample_rate,
        verbose=True,
    )

    x = reducer.fit_transform(embeds)

    df['x'] = x[:, 0]
    df['y'] = x[:, 1]

    df.to_parquet(dst)
    print(df)
```
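Downstream, picking one of the three fits out of the parquet is just a column selection on the `_nn*` suffixes described above. A minimal sketch, assuming a pandas frame with the card's column names (a toy two-row frame stands in for the real 12M-row data):

```python
import pandas as pd

# Toy stand-in for the real parquet: one row per caption, with an
# (x, y) pair per UMAP fit, suffixed by the n_neighbors value.
df = pd.DataFrame({
    'TEXT': ['a painting of a cat', 'a photo of a dog'],
    'x_nn10': [0.1, 0.2], 'y_nn10': [1.0, 1.1],
    'x_nn30': [0.3, 0.4], 'y_nn30': [2.0, 2.1],
    'x_nn60': [0.5, 0.6], 'y_nn60': [3.0, 3.1],
})

def coords(df, n_neighbors):
    """Return an (n, 2) array of coordinates for one UMAP fit."""
    return df[[f'x_nn{n_neighbors}', f'y_nn{n_neighbors}']].to_numpy()

print(coords(df, 30).shape)  # (2, 2)
```

The same frame can then be handed straight to a scatter plot, one call per `n_neighbors` setting.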
Ali-C137/Arabic_guanaco_oasst1
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 20962143
    num_examples: 9846
  - name: test
    num_bytes: 1102534
    num_examples: 518
  download_size: 10417464
  dataset_size: 22064677
license: apache-2.0
language:
- ar
size_categories:
- 1K<n<10K
---

# Dataset Card for "Arabic_guanaco_oasst1"

This dataset is the openassistant-guanaco dataset, a subset of the Open Assistant dataset, translated into Arabic.

You can find the original dataset here: https://huggingface.co/datasets/timdettmers/openassistant-guanaco

Or the main dataset here: https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main

This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples. For further information, please see the main dataset.

License: Apache 2.0

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
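For intuition, the "highest-rated path" selection can be sketched on a toy conversation tree. This is a greedy per-turn illustration only; the structure and field names (`replies`, `rating`) are hypothetical stand-ins for the real oasst1 message trees, where the filtering was actually done upstream:

```python
def best_path(node):
    """Follow the highest-rated reply at each turn, root to leaf."""
    path = [node['text']]
    while node.get('replies'):
        node = max(node['replies'], key=lambda r: r['rating'])
        path.append(node['text'])
    return path

# Toy tree: a prompt with two rated answers, one of which has a follow-up.
tree = {
    'text': 'prompt',
    'replies': [
        {'text': 'weak answer', 'rating': 0.2, 'replies': []},
        {'text': 'strong answer', 'rating': 0.9, 'replies': [
            {'text': 'follow-up', 'rating': 0.7, 'replies': []},
        ]},
    ],
}

print(best_path(tree))  # ['prompt', 'strong answer', 'follow-up']
```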
liuyanchen1015/MULTI_VALUE_wnli_completive_done
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 2794 num_examples: 12 - name: test num_bytes: 16167 num_examples: 57 - name: train num_bytes: 29881 num_examples: 129 download_size: 23959 dataset_size: 48842 --- # Dataset Card for "MULTI_VALUE_wnli_completive_done" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo2_100_kl_0.1_prm_70m_thr_0.1_seed_1
--- dataset_info: config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: preference dtype: int64 - name: output_1 dtype: string - name: output_2 dtype: string - name: reward_model_prompt_format dtype: string - name: gen_prompt_format dtype: string - name: gen_kwargs struct: - name: do_sample dtype: bool - name: max_new_tokens dtype: int64 - name: pad_token_id dtype: int64 - name: top_k dtype: int64 - name: top_p dtype: float64 - name: reward_1 dtype: float64 - name: reward_2 dtype: float64 - name: n_samples dtype: int64 - name: reject_select dtype: string - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string - name: index dtype: int64 - name: filtered_epoch dtype: int64 - name: gen_reward dtype: float64 - name: gen_response dtype: string splits: - name: epoch_0 num_bytes: 43597508 num_examples: 18929 - name: epoch_1 num_bytes: 44108270 num_examples: 18929 - name: epoch_2 num_bytes: 44197336 num_examples: 18929 - name: epoch_3 num_bytes: 44234242 num_examples: 18929 - name: epoch_4 num_bytes: 44258182 num_examples: 18929 - name: epoch_5 num_bytes: 44267278 num_examples: 18929 - name: epoch_6 num_bytes: 44275079 num_examples: 18929 - name: epoch_7 num_bytes: 44279821 num_examples: 18929 - name: epoch_8 num_bytes: 44284005 num_examples: 18929 - name: epoch_9 num_bytes: 44284855 num_examples: 18929 - name: epoch_10 num_bytes: 44284403 num_examples: 18929 - name: epoch_11 num_bytes: 44285728 num_examples: 18929 - name: epoch_12 num_bytes: 44284649 num_examples: 18929 - name: epoch_13 num_bytes: 44286198 num_examples: 18929 - name: epoch_14 num_bytes: 44285578 num_examples: 18929 - name: epoch_15 num_bytes: 44286582 num_examples: 18929 - name: epoch_16 num_bytes: 44286800 num_examples: 18929 - name: epoch_17 num_bytes: 44286456 num_examples: 18929 - name: epoch_18 num_bytes: 44286747 
num_examples: 18929 - name: epoch_19 num_bytes: 44286328 num_examples: 18929 - name: epoch_20 num_bytes: 44286881 num_examples: 18929 - name: epoch_21 num_bytes: 44286436 num_examples: 18929 - name: epoch_22 num_bytes: 44287105 num_examples: 18929 - name: epoch_23 num_bytes: 44286814 num_examples: 18929 - name: epoch_24 num_bytes: 44287600 num_examples: 18929 - name: epoch_25 num_bytes: 44287112 num_examples: 18929 - name: epoch_26 num_bytes: 44287905 num_examples: 18929 - name: epoch_27 num_bytes: 44287971 num_examples: 18929 - name: epoch_28 num_bytes: 44288490 num_examples: 18929 - name: epoch_29 num_bytes: 44287015 num_examples: 18929 download_size: 699430221 dataset_size: 1327519374 configs: - config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 data_files: - split: epoch_0 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-* - split: epoch_1 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-* - split: epoch_2 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-* - split: epoch_3 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-* - split: epoch_4 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-* - split: epoch_5 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-* - split: epoch_6 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-* - split: epoch_7 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-* - split: epoch_8 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-* - split: epoch_9 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-* - split: epoch_10 path: 
alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-* - split: epoch_11 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-* - split: epoch_12 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-* - split: epoch_13 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-* - split: epoch_14 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-* - split: epoch_15 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-* - split: epoch_16 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-* - split: epoch_17 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-* - split: epoch_18 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-* - split: epoch_19 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-* - split: epoch_20 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-* - split: epoch_21 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-* - split: epoch_22 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-* - split: epoch_23 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-* - split: epoch_24 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-* - split: epoch_25 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-* - split: epoch_26 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-* - split: epoch_27 path: 
alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-* - split: epoch_28 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-* - split: epoch_29 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-* ---
jilp00/youtoks-curious-amalgam-v2-chatml
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 6377681 num_examples: 9358 download_size: 2582238 dataset_size: 6377681 configs: - config_name: default data_files: - split: train path: data/train-* ---
cwiz/igor-gofman-text
---
license: apache-2.0
---

The complete corpus of Igor Gofman's sayings and posts.
peter2000/ecoicop_online_product
--- license: cc task_categories: - text-classification language: - de - fr - it size_categories: - 10K<n<100K ---
dmrau/cqadupstack-tex
--- configs: - config_name: default data_files: - split: queries path: data/queries-* - split: corpus path: data/corpus-* dataset_info: features: - name: _id dtype: string - name: text dtype: string - name: title dtype: string splits: - name: queries num_bytes: 186934 num_examples: 2906 - name: corpus num_bytes: 86600423 num_examples: 68184 download_size: 43424126 dataset_size: 86787357 --- # Dataset Card for "cqadupstack-tex" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Abirami/tamilwikipedia
--- license: other ---
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo2_100_kl_0.1_prm_410m_thr_0.3_seed_2
--- dataset_info: config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500 features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: preference dtype: int64 - name: output_1 dtype: string - name: output_2 dtype: string - name: reward_model_prompt_format dtype: string - name: gen_prompt_format dtype: string - name: gen_kwargs struct: - name: do_sample dtype: bool - name: max_new_tokens dtype: int64 - name: pad_token_id dtype: int64 - name: top_k dtype: int64 - name: top_p dtype: float64 - name: reward_1 dtype: float64 - name: reward_2 dtype: float64 - name: n_samples dtype: int64 - name: reject_select dtype: string - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string - name: index dtype: int64 - name: filtered_epoch dtype: int64 - name: gen_reward dtype: float64 - name: gen_response dtype: string splits: - name: epoch_0 num_bytes: 43590759 num_examples: 18929 - name: epoch_1 num_bytes: 43797842 num_examples: 18929 - name: epoch_2 num_bytes: 43783778 num_examples: 18929 - name: epoch_3 num_bytes: 43722686 num_examples: 18929 - name: epoch_4 num_bytes: 43676560 num_examples: 18929 - name: epoch_5 num_bytes: 43644514 num_examples: 18929 - name: epoch_6 num_bytes: 43629730 num_examples: 18929 - name: epoch_7 num_bytes: 43614988 num_examples: 18929 - name: epoch_8 num_bytes: 43615944 num_examples: 18929 - name: epoch_9 num_bytes: 43610500 num_examples: 18929 download_size: 231996117 dataset_size: 436687301 configs: - config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500 data_files: - split: epoch_0 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-* - split: epoch_1 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-* - split: epoch_2 path: 
alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-* - split: epoch_3 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-* - split: epoch_4 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-* - split: epoch_5 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-* - split: epoch_6 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-* - split: epoch_7 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-* - split: epoch_8 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-* - split: epoch_9 path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-* ---
Ngat/NgatNang
--- license: creativeml-openrail-m ---
Shoubhik8/mpt_finetune_dataset
--- dataset_info: features: - name: prompt dtype: string - name: response dtype: string splits: - name: train num_bytes: 331283580 num_examples: 371277 download_size: 13534489 dataset_size: 331283580 --- # Dataset Card for "mpt_finetune_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
hczhu/TickerTick-stock-news
--- license: mit --- https://github.com/hczhu/TickerTick-API/releases
Zombely/fiszki-ocr-train
--- dataset_info: features: - name: image dtype: image - name: ground_truth dtype: string splits: - name: train num_bytes: 354017910.0 num_examples: 85 - name: validation num_bytes: 56459717.0 num_examples: 14 download_size: 410390428 dataset_size: 410477627.0 --- # Dataset Card for "fiszki-ocr-train" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
luona/datasetplayground
--- license: apache-2.0 ---
mbarnig/Tatoeba-en-lb
--- license: cc-by-nc-sa-4.0 ---
naorm/all-captions-screen2words-16bit-blip2
--- dataset_info: features: - name: image dtype: image - name: hf-blip2-16bit dtype: string - name: hf-blip2-coco-16bit dtype: string splits: - name: train num_bytes: 448040117.41 num_examples: 4310 download_size: 362303052 dataset_size: 448040117.41 configs: - config_name: default data_files: - split: train path: data/train-* ---
rcds/swiss_rulings
---
license: cc-by-sa-4.0
language:
- it
- de
- fr
pretty_name: Swiss Rulings
size_categories:
- 100K<n<1M
---

# Dataset Card for Swiss Rulings

## Table of Contents

- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
- [Dataset Structure](#dataset-structure)
  - [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
  - [Curation Rationale](#curation-rationale)
  - [Source Data](#source-data)
  - [Annotations](#annotations)
  - [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
  - [Social Impact of Dataset](#social-impact-of-dataset)
  - [Discussion of Biases](#discussion-of-biases)
  - [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
  - [Dataset Curators](#dataset-curators)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)
  - [Contributions](#contributions)

## Dataset Description

- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**

### Dataset Summary

SwissRulings is a multilingual, diachronic dataset of 637K Swiss Federal Supreme Court (FSCS) cases. This dataset can be used to pretrain language models on Swiss legal data.

### Supported Tasks and Leaderboards

### Languages

Switzerland has four official languages, three of which (German, French and Italian) are represented in this dataset. The decisions are written by the judges and clerks in the language of the proceedings.
| Language | Subset | Number of Documents Full |
|----------|--------|--------------------------|
| German   | **de** | 319K |
| French   | **fr** | 246K |
| Italian  | **it** | 71K |

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

```
decision_id (string)
facts (string)
considerations (string)
origin_facts (string)
origin_considerations (string)
law_area (string)
language (string)
year (int32)
court (string)
chamber (string)
canton (string)
region (string)
```

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

The original data are published by the Swiss Federal Supreme Court (https://www.bger.ch) in unprocessed formats (HTML). The documents were downloaded from the Entscheidsuche portal (https://entscheidsuche.ch) in HTML.

#### Who are the source language producers?

The decisions are written by the judges and clerks in the language of the proceedings.

### Annotations

#### Annotation process

#### Who are the annotators?

Metadata is published by the Swiss Federal Supreme Court (https://www.bger.ch).

### Personal and Sensitive Information

The dataset contains publicly available court decisions from the Swiss Federal Supreme Court. Personal or sensitive information has been anonymized by the court before publication according to the following guidelines: https://www.bger.ch/home/juridiction/anonymisierungsregeln.html.
## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

We release the data under CC-BY-4.0, which complies with the court licensing (https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf)

© Swiss Federal Supreme Court, 2002-2022

The copyright for the editorial content of this website and the consolidated texts, which is owned by the Swiss Federal Supreme Court, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.

Source: https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf

### Citation Information

Please cite our [ArXiv-Preprint](https://arxiv.org/abs/2306.09237)

```
@misc{rasiah2023scale,
  title={SCALE: Scaling up the Complexity for Advanced Language Model Evaluation},
  author={Vishvaksenan Rasiah and Ronja Stern and Veton Matoshi and Matthias Stürmer and Ilias Chalkidis and Daniel E. Ho and Joel Niklaus},
  year={2023},
  eprint={2306.09237},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

### Contributions