CyberHarem/eimi_bluearchive
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of eimi/和泉元エイミ/艾米 (Blue Archive)

This is the dataset of eimi/和泉元エイミ/艾米 (Blue Archive), containing 500 images and their tags. The core tags of this character are `pink_hair, breasts, halo, long_hair, large_breasts, pink_eyes, pink_halo, goggles_on_head, ponytail, purple_eyes, very_long_hair`, which are pruned in this dataset.

Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 999.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eimi_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 812.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eimi_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1346 | 1.69 GiB | [Download](https://huggingface.co/datasets/CyberHarem/eimi_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/eimi_bluearchive',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1boy, 1girl, blush, hetero, official_alternate_costume, ski_goggles, solo_focus, black_bikini, paizuri, pov_crotch, huge_breasts, looking_at_viewer, nipples, breasts_squeezed_together, pink_jacket, closed_mouth, long_sleeves, penis, smile, mosaic_censoring, open_mouth, sweat, white_scarf |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, :o, bare_shoulders, black_bikini, blush, cleavage, cowboy_shot, long_sleeves, looking_at_viewer, navel, official_alternate_costume, open_jacket, open_mouth, pink_jacket, side-tie_bikini_bottom, solo, stomach, thighs, white_scarf, off_shoulder, sideboob, simple_background, ski_goggles, white_background |
| 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_bikini, blush, cleavage, long_sleeves, looking_at_viewer, navel, official_alternate_costume, open_jacket, open_mouth, pink_jacket, side-tie_bikini_bottom, ski_goggles, solo, stomach, white_scarf, outdoors, cowboy_shot, :o |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, animal, black_bikini, blush, cleavage, cowboy_shot, goggles, long_sleeves, looking_at_viewer, navel, official_alternate_costume, open_jacket, penguin, pink_jacket, side-tie_bikini_bottom, solo, stomach, white_scarf, outdoors, open_mouth, day |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, black_bikini, black_jacket, headphones_around_neck, long_sleeves, looking_at_viewer, necktie_between_breasts, open_clothes, red_necktie, solo, white_shirt, zipper_pull_tab, bandaid, blush, cleavage, off_shoulder, parted_lips, hair_ornament, skirt |
| 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_bikini, blush, cleavage, headphones_around_neck, long_sleeves, looking_at_viewer, necktie_between_breasts, simple_background, solo, white_background, white_shirt, black_jacket, huge_breasts, open_shirt, red_necktie, upper_body, zipper_pull_tab, open_mouth, bandaid, hair_ornament, sideboob |
| 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, blush, hetero, penis, pussy, solo_focus, vaginal, hair_ornament, missionary, necktie_between_breasts, nipples, on_back, pillow, red_necktie, spread_legs, black_bikini, headphones_around_neck, looking_at_viewer, mosaic_censoring, open_mouth, zipper, bandaid, bar_censor, black_jacket, clothed_sex, navel, open_clothes, sweat, white_shirt, white_skirt |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | blush | hetero | official_alternate_costume | ski_goggles | solo_focus | black_bikini | paizuri | pov_crotch | huge_breasts | looking_at_viewer | nipples | breasts_squeezed_together | pink_jacket | closed_mouth | long_sleeves | penis | smile | mosaic_censoring | open_mouth | sweat | white_scarf | :o | bare_shoulders | cleavage | cowboy_shot | navel | open_jacket | side-tie_bikini_bottom | solo | stomach | thighs | off_shoulder | sideboob | simple_background | white_background | outdoors | animal | goggles | penguin | day | black_jacket | headphones_around_neck | necktie_between_breasts | open_clothes | red_necktie | white_shirt | zipper_pull_tab | bandaid | parted_lips | hair_ornament | skirt | open_shirt | upper_body | pussy | vaginal | missionary | on_back | pillow | spread_legs | zipper | bar_censor | clothed_sex | white_skirt |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) |  | X | X |  | X | X |  | X |  |  |  | X |  |  | X |  | X |  |  |  | X |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) |  | X | X |  | X | X |  | X |  |  |  | X |  |  | X |  | X |  |  |  | X |  | X | X |  | X | X | X | X | X | X | X |  |  |  |  |  | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) |  | X | X |  | X |  |  | X |  |  |  | X |  |  | X |  | X |  |  |  | X |  | X |  |  | X | X | X | X | X | X | X |  |  |  |  |  | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) |  | X | X |  |  |  |  | X |  |  |  | X |  |  |  |  | X |  |  |  |  |  |  |  | X | X |  |  |  |  | X |  |  | X |  |  |  |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |
| 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) |  | X | X |  |  |  |  | X |  |  | X | X |  |  |  |  | X |  |  |  | X |  |  |  |  | X |  |  |  |  | X |  |  |  | X | X | X |  |  |  |  |  | X | X | X |  | X | X | X | X |  | X |  | X | X |  |  |  |  |  |  |  |  |  |  |
| 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X |  |  | X | X |  |  |  | X | X |  |  |  |  | X |  | X | X | X |  |  |  |  |  | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  | X | X | X | X | X | X |  | X |  | X |  |  |  | X | X | X | X | X | X | X | X | X | X |
CyberHarem/viola_pokemon
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of viola/ビオラ (Pokémon)

This is the dataset of viola/ビオラ (Pokémon), containing 242 images and their tags. The core tags of this character are `blonde_hair, green_eyes, breasts, large_breasts`, which are pruned in this dataset.

Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------|:-----------|:---------------------------------------------------------------------|
| raw | 242 | 208.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viola_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 242 | 136.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viola_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 506 | 264.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viola_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 242 | 190.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viola_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 506 | 350.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viola_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/viola_pokemon',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, crop_top, green_pants, sleeveless_shirt, white_shirt, wristband, open_mouth, tongue, :d, holding_camera, midriff, eyelashes, solo, looking_at_viewer, pokemon_(creature), upper_teeth_only, white_belt |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1boy, 1girl, blush, hetero, paizuri, cum_on_breasts, huge_breasts, open_mouth, penis, smile, solo_focus, nipples, ejaculation, looking_at_viewer, shirt_lift |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | crop_top | green_pants | sleeveless_shirt | white_shirt | wristband | open_mouth | tongue | :d | holding_camera | midriff | eyelashes | solo | looking_at_viewer | pokemon_(creature) | upper_teeth_only | white_belt | 1boy | blush | hetero | paizuri | cum_on_breasts | huge_breasts | penis | smile | solo_focus | nipples | ejaculation | shirt_lift |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X |  |  |  |  |  | X |  |  |  |  |  |  | X |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
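The Table Version above is just a presence matrix derived from the raw per-cluster tag lists, with columns ordered by first appearance. A minimal sketch of how such a matrix can be rebuilt (the function name is an assumption of this sketch):

```python
def tag_matrix(cluster_tags):
    """Build a tag-presence matrix from per-cluster tag strings.

    cluster_tags: list of comma-separated tag strings, one per cluster.
    Returns (columns, rows): columns preserve first-appearance order,
    and each row holds 'X' or '' per column, as in the Table Version.
    """
    parsed = [[t.strip() for t in s.split(',')] for s in cluster_tags]
    # collect columns in order of first appearance across clusters
    columns = []
    for tags in parsed:
        for t in tags:
            if t not in columns:
                columns.append(t)
    # one row per cluster, marking tag presence
    rows = [['X' if c in tags else '' for c in columns] for tags in parsed]
    return columns, rows
```

Feeding it the two raw tag strings above reproduces the X pattern of the viola table.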
kri-ti-ka1/guanaco-llama2-1k
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 1654448
    num_examples: 1000
  download_size: 966692
  dataset_size: 1654448
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
open-llm-leaderboard/details_pankajmathur__Mistral-7B-model_45k6e2e4
--- pretty_name: Evaluation run of pankajmathur/Mistral-7B-model_45k6e2e4 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [pankajmathur/Mistral-7B-model_45k6e2e4](https://huggingface.co/pankajmathur/Mistral-7B-model_45k6e2e4)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__Mistral-7B-model_45k6e2e4_public\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-11-08T12:00:55.074514](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__Mistral-7B-model_45k6e2e4_public/blob/main/results_2023-11-08T12-00-55.074514.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23184197178254598,\n\ \ \"acc_stderr\": 0.030693965962788314,\n \"acc_norm\": 0.23241967500614574,\n\ \ \"acc_norm_stderr\": 0.030706854185546608,\n \"mc1\": 0.24357405140758873,\n\ \ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.5084657838244592,\n\ \ \"mc2_stderr\": 0.016201328114036084\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.2022184300341297,\n \"acc_stderr\": 0.011737454431872104,\n\ \ \"acc_norm\": 0.2431740614334471,\n \"acc_norm_stderr\": 0.01253655414458709\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2577175861382195,\n\ \ \"acc_stderr\": 0.004364838000335622,\n \"acc_norm\": 0.2508464449312886,\n\ \ \"acc_norm_stderr\": 0.0043261434303600976\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \ \ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\ \ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\ \ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\ \ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\ \ \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\ \ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\ \ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\ \ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\ \ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\ \ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\ \ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\ \ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\ acc_norm\": 
0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\ \ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\ \ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\ acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938062,\n \"\ acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938062\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\ : 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\ acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\ \ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.02047323317355198,\n\ \ \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.02047323317355198\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2074074074074074,\n \"acc_stderr\": 0.024720713193952165,\n \ \ \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.024720713193952165\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\ \ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294285,\n \"\ acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294285\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\ acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\ acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\ \ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\ \ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\ \ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\ \ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\ \ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\ \ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\ \ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\ \ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\ \ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\ \ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\ \ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\ \ \"acc_norm_stderr\": 
0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\ \ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\ \ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\ \ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \ \ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\ \ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\ \ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\ \ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\ \ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\ \ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.24378109452736318,\n 
\"acc_stderr\": 0.03036049015401465,\n\ \ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\ \ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\ \ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\ \ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\ \ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n\ \ \"mc2\": 0.5084657838244592,\n \"mc2_stderr\": 0.016201328114036084\n\ \ }\n}\n```" repo_url: https://huggingface.co/pankajmathur/Mistral-7B-model_45k6e2e4 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|arc:challenge|25_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hellaswag|10_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T12-00-55.074514.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T12-00-55.074514.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T12-00-55.074514.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T12-00-55.074514.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T12-00-55.074514.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T12-00-55.074514.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-08T12-00-55.074514.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T12-00-55.074514.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T12-00-55.074514.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-management|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-08T12-00-55.074514.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-virology|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T12-00-55.074514.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_11_08T12_00_55.074514 path: - '**/details_harness|truthfulqa:mc|0_2023-11-08T12-00-55.074514.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-11-08T12-00-55.074514.parquet' - config_name: results data_files: - split: 2023_11_08T12_00_55.074514 path: - results_2023-11-08T12-00-55.074514.parquet - split: latest path: - results_2023-11-08T12-00-55.074514.parquet --- # Dataset Card for Evaluation run of pankajmathur/Mistral-7B-model_45k6e2e4 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/pankajmathur/Mistral-7B-model_45k6e2e4 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [pankajmathur/Mistral-7B-model_45k6e2e4](https://huggingface.co/pankajmathur/Mistral-7B-model_45k6e2e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_pankajmathur__Mistral-7B-model_45k6e2e4_public",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```

## Latest results

These are the [latest results from run 2023-11-08T12:00:55.074514](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__Mistral-7B-model_45k6e2e4_public/blob/main/results_2023-11-08T12-00-55.074514.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23184197178254598, "acc_stderr": 0.030693965962788314, "acc_norm": 0.23241967500614574, "acc_norm_stderr": 0.030706854185546608, "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.5084657838244592, "mc2_stderr": 0.016201328114036084 }, "harness|arc:challenge|25": { "acc": 0.2022184300341297, "acc_stderr": 0.011737454431872104, "acc_norm": 0.2431740614334471, "acc_norm_stderr": 0.01253655414458709 }, "harness|hellaswag|10": { "acc": 0.2577175861382195, "acc_stderr": 0.004364838000335622, "acc_norm": 0.2508464449312886, "acc_norm_stderr": 0.0043261434303600976 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.17, "acc_stderr": 0.0377525168068637, "acc_norm": 0.17, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.24, "acc_stderr": 0.04292346959909284, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, 
"acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.18, "acc_stderr": 0.03861229196653695, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653695 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24867724867724866, "acc_stderr": 0.022261817692400175, "acc_norm": 0.24867724867724866, "acc_norm_stderr": 0.022261817692400175 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30158730158730157, "acc_stderr": 0.04104947269903394, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.04104947269903394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938062, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938062 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20512820512820512, "acc_stderr": 0.02047323317355198, "acc_norm": 0.20512820512820512, "acc_norm_stderr": 0.02047323317355198 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2074074074074074, "acc_stderr": 0.024720713193952165, "acc_norm": 0.2074074074074074, "acc_norm_stderr": 0.024720713193952165 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.18543046357615894, "acc_stderr": 0.03173284384294285, "acc_norm": 0.18543046357615894, "acc_norm_stderr": 0.03173284384294285 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 
0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.04464285714285713, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285713 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, 
"acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432414, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432414 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 
0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.5084657838244592, "mc2_stderr": 0.016201328114036084 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
Sultu/Azulamigos
--- license: openrail ---
christinacdl/HatEval_2019_Test_Set_Task5
--- license: apache-2.0 language: - en --- Test set from HatEval (Basile et al., 2019), SemEval-2019 Task 5
yjernite/prof_images_blip__22h-vintedois-diffusion-v0-1
--- dataset_info: features: - name: images dtype: image - name: embeddings sequence: float32 splits: - name: bartender num_bytes: 4221598.0 num_examples: 100 - name: accountant num_bytes: 3206485.0 num_examples: 100 - name: baker num_bytes: 3871443.0 num_examples: 100 - name: artist num_bytes: 4244089.0 num_examples: 100 - name: author num_bytes: 3813285.0 num_examples: 100 - name: clergy num_bytes: 3282554.0 num_examples: 100 - name: customer_service_representative num_bytes: 3217003.0 num_examples: 100 - name: dental_hygienist num_bytes: 3079331.0 num_examples: 100 - name: electrician num_bytes: 4371703.0 num_examples: 100 - name: carpet_installer num_bytes: 4389212.0 num_examples: 100 - name: civil_engineer num_bytes: 3841611.0 num_examples: 100 - name: ceo num_bytes: 2997987.0 num_examples: 100 - name: computer_support_specialist num_bytes: 3641931.0 num_examples: 100 - name: dentist num_bytes: 3104962.0 num_examples: 100 - name: butcher num_bytes: 4351854.0 num_examples: 100 - name: courier num_bytes: 3640022.0 num_examples: 100 - name: computer_programmer num_bytes: 4180355.0 num_examples: 100 - name: correctional_officer num_bytes: 4070069.0 num_examples: 100 - name: executive_assistant num_bytes: 3199680.0 num_examples: 100 - name: designer num_bytes: 3433880.0 num_examples: 100 - name: aerospace_engineer num_bytes: 4278650.0 num_examples: 100 - name: data_entry_keyer num_bytes: 3900333.0 num_examples: 100 - name: event_planner num_bytes: 3547339.0 num_examples: 100 - name: cook num_bytes: 3467370.0 num_examples: 100 - name: construction_worker num_bytes: 3894234.0 num_examples: 100 - name: air_conditioning_installer num_bytes: 4217322.0 num_examples: 100 - name: electrical_engineer num_bytes: 4562412.0 num_examples: 100 - name: career_counselor num_bytes: 3415428.0 num_examples: 100 - name: clerk num_bytes: 3213913.0 num_examples: 100 - name: director num_bytes: 3305172.0 num_examples: 100 - name: cleaner num_bytes: 3475664.0 num_examples: 100 - name: 
computer_systems_analyst num_bytes: 3991071.0 num_examples: 100 - name: dental_assistant num_bytes: 2979208.0 num_examples: 100 - name: architect num_bytes: 3890945.0 num_examples: 100 - name: drywall_installer num_bytes: 3579519.0 num_examples: 100 - name: childcare_worker num_bytes: 3586015.0 num_examples: 100 - name: community_manager num_bytes: 3301952.0 num_examples: 100 - name: carpenter num_bytes: 4415058.0 num_examples: 100 - name: claims_appraiser num_bytes: 3836012.0 num_examples: 100 - name: dispatcher num_bytes: 4344042.0 num_examples: 100 - name: cashier num_bytes: 3728570.0 num_examples: 100 - name: detective num_bytes: 3347937.0 num_examples: 100 - name: engineer num_bytes: 3867898.0 num_examples: 100 - name: dishwasher num_bytes: 4831099.0 num_examples: 100 - name: credit_counselor num_bytes: 3139784.0 num_examples: 100 - name: doctor num_bytes: 3124348.0 num_examples: 100 - name: compliance_officer num_bytes: 3471476.0 num_examples: 100 - name: aide num_bytes: 3358153.0 num_examples: 100 - name: bus_driver num_bytes: 4250786.0 num_examples: 100 - name: coach num_bytes: 3644886.0 num_examples: 100 download_size: 75923643 dataset_size: 186125650.0 --- # Dataset Card for "prof_images_blip__22h-vintedois-diffusion-v0-1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mevol/protein_structure_NER_model_v1.2
--- license: mit language: - en tags: - biology - protein structure - token classification configs: - config_name: protein_structure_NER_model_v1.2 data_files: - split: train path: "annotation_IOB/train.tsv" - split: dev path: "annotation_IOB/dev.tsv" - split: test path: "annotation_IOB/test.tsv" --- ## Overview This data was used to train the model: https://huggingface.co/mevol/BiomedNLP-PubMedBERT-ProteinStructure-NER-v1.2 There are 19 different entity types in this dataset: "chemical", "complex_assembly", "evidence", "experimental_method", "gene", "mutant", "oligomeric_state", "protein", "protein_state", "protein_type", "ptm", "residue_name", "residue_name_number", "residue_number", "residue_range", "site", "species", "structure_element", "taxonomy_domain" The data, prepared as IOB-formatted input, was used for training, development and testing. Additional data formats such as JSON and XML as well as CSV files are also available and are described below. Annotation was carried out with the free annotation tool TeamTat (https://www.teamtat.org/) and documents were downloaded as BioC XML before converting them to IOB, annotation-only JSON and CSV format. The number of annotations and sentences in each file is given below: | document ID | number of annotations in BioC XML | number of annotations in IOB/JSON/CSV | number of sentences | | --- | --- | --- | --- | | PMC4850273 | 1121 | 1121 | 204 | | PMC4784909 | 865 | 865 | 204 | | PMC4850288 | 716 | 708 | 146 | | PMC4887326 | 933 | 933 | 152 | | PMC4833862 | 1044 | 1044 | 192 | | PMC4832331 | 739 | 718 | 134 | | PMC4852598 | 1229 | 1218 | 250 | | PMC4786784 | 1549 | 1549 | 232 | | PMC4848090 | 987 | 985 | 191 | | PMC4792962 | 1268 | 1268 | 256 | | total | 10451 | 10409 | 1961 | Documents and annotations are easiest viewed by opening the BioC XML files in the free annotation tool TeamTat (https://www.teamtat.org/). 
More about the BioC format can be found here: https://bioc.sourceforge.net/ ## Raw BioC XML files These are the raw, unannotated XML files for the publications in the dataset in BioC format. The files are found in the directory: "raw_BioC_XML". There is one file for each document and they follow the standard naming "unique PubMedCentral ID"_raw.xml. ## Annotations in IOB format The IOB-formatted files can be found in the directory: "annotation_IOB" The four files are as follows: * all.tsv --> all sentences and annotations used to create model "mevol/BiomedNLP-PubMedBERT-ProteinStructure-NER-v1.2"; 1961 sentences * train.tsv --> training subset of the data; 1372 sentences * dev.tsv --> development subset of the data; 294 sentences * test.tsv --> testing subset of the data; 295 sentences The total number of annotations is: 10409 ## Annotations in BioC JSON The BioC-formatted JSON files of the publications have been downloaded from the annotation tool TeamTat. The files are found in the directory: "annotated_BioC_JSON" There is one file for each document and they follow the standard naming "unique PubMedCentral ID"_ann.json Each document JSON contains the following relevant keys: * "sourceid" --> giving the numerical part of the unique PubMedCentral ID * "text" --> containing the complete raw text of the publication as a string * "denotations" --> containing a list of all the annotations for the text Each annotation is a dictionary with the following keys: * "span" --> gives the start and end of the annotation span, defined by the sub-keys: * "begin" --> character start position of annotation * "end" --> character end position of annotation * "obj" --> a string containing a number of terms separated by ","; the order of the terms is: entity type, reference to ontology, annotator, time stamp * "id" --> unique annotation ID Here is an example: ```json [{"sourceid":"4784909", "sourcedb":"", "project":"", "target":"", "text":"", 
"denotations":[{"span":{"begin":24, "end":34}, "obj":"chemical,CHEBI:,melaniev@ebi.ac.uk,2023-03-21T15:19:42Z", "id":"4500"}, {"span":{"begin":50, "end":59}, "obj":"taxonomy_domain,DUMMY:,melaniev@ebi.ac.uk,2023-03-21T15:15:03Z", "id":"1281"}] } ] ``` ## Annotations in BioC XML The BioC-formatted XML files of the publications have been downloaded from the annotation tool TeamTat. The files are found in the directory: "annotated_BioC_XML" There is one file for each document and they follow the standard naming "unique PubMedCentral ID"_ann.xml The key XML tags for visualising the annotations in TeamTat, and for extracting them to create the training data, are "passage" and "offset". The "passage" tag encloses a text passage or paragraph to which the annotations are linked. "Offset" gives the passage/paragraph offset and makes it possible to determine the starting and ending character positions of the annotations. The tag "text" encloses the raw text of the passage. Each annotation in the XML file is tagged as below: * "annotation id=" --> giving the unique ID of the annotation * "infon key="type"" --> giving the entity type of the annotation * "infon key="identifier"" --> giving a reference to an ontology for the annotation * "infon key="annotator"" --> giving the annotator * "infon key="updated_at"" --> providing a time stamp for annotation creation/update * "location" --> start and end character positions for the annotated text span * "offset" --> start character position as defined by the offset value * "length" --> length of the annotation span; the sum of "offset" and "length" gives the end character position Here is a basic example of what the BioC XML looks like. Additional tags for document management are not given. Please refer to the documentation to find out more. 
```xml <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE collection SYSTEM "BioC.dtd"> <collection> <source>PMC</source> <date>20140719</date> <key>pmc.key</key> <document> <id>4784909</id> <passage> <offset>0</offset> <text>The Structural Basis of Coenzyme A Recycling in a Bacterial Organelle</text> <annotation id="4500"> <infon key="type">chemical</infon> <infon key="identifier">CHEBI:</infon> <infon key="annotator">melaniev@ebi.ac.uk</infon> <infon key="updated_at">2023-03-21T15:19:42Z</infon> <location offset="24" length="10"/> <text>Coenzyme A</text> </annotation> </passage> </document> </collection> ``` ## Annotations in CSV The annotations, together with the sentences in which they were found, have also been made available as tab-separated CSV files, one for each publication in the dataset. The files can be found in the directory "annotation_CSV". Each file is named "unique PubMedCentral ID".csv. The column labels in the CSV files are as follows: * "anno_start" --> character start position of the annotation * "anno_end" --> character end position of the annotation * "anno_text" --> text covered by the annotation * "entity_type" --> entity type of the annotation * "sentence" --> sentence text in which the annotation was found * "section" --> publication section in which the annotation was found ## Annotations in JSON A combined JSON file was created containing only the relevant sentences and associated annotations for each publication in the dataset. The file can be found in the directory "annotation_JSON" under the name "annotations.json". 
The following keys are used: * "PMC4850273" --> unique PubMedCentral ID of the publication * "annotations" --> list of dictionaries for the relevant, annotated sentences of the document; each dictionary has the following sub-keys * "sid" --> unique sentence ID * "sent" --> sentence text as string * "section" --> publication section the sentence is in * "ner" --> nested list of annotations; each sublist contains the following items: start character position, end character position, annotation text, entity type Here is an example of a sentence and its annotations: ```json {"PMC4850273": {"annotations": [{"sid": 0, "sent": "Molecular Dissection of Xyloglucan Recognition in a Prominent Human Gut Symbiont", "section": "TITLE", "ner": [[24, 34, "Xyloglucan", "chemical"], [62, 67, "Human", "species"]]}]}} ```
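As a quick sanity check of the offsets, the nested "ner" lists can be unpacked with a few lines of Python. This is an illustrative sketch built around the sample record above; the `collect_entities` helper is not part of the dataset, just an assumption about how one might consume the file:

```python
import json

# Illustrative sample mirroring the structure of "annotations.json";
# the real file contains one such entry per publication.
sample = json.loads("""
{"PMC4850273": {"annotations": [{"sid": 0,
  "sent": "Molecular Dissection of Xyloglucan Recognition in a Prominent Human Gut Symbiont",
  "section": "TITLE",
  "ner": [[24, 34, "Xyloglucan", "chemical"], [62, 67, "Human", "species"]]}]}}
""")

def collect_entities(doc):
    """Flatten the nested 'ner' lists into (span text, entity type) pairs."""
    pairs = []
    for pmcid, content in doc.items():
        for sentence in content["annotations"]:
            for start, end, text, entity_type in sentence["ner"]:
                # The character offsets index directly into the sentence string.
                assert sentence["sent"][start:end] == text
                pairs.append((text, entity_type))
    return pairs

print(collect_entities(sample))
```

Running this prints `[('Xyloglucan', 'chemical'), ('Human', 'species')]`, confirming that the start/end offsets index directly into the sentence text.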
shidowake/glaive-code-assistant-v1-sharegpt-format_split_9
--- dataset_info: features: - name: conversations list: - name: from dtype: string - name: value dtype: string splits: - name: train num_bytes: 10503837.603832223 num_examples: 6805 download_size: 5135439 dataset_size: 10503837.603832223 configs: - config_name: default data_files: - split: train path: data/train-* ---
ehristoforu/dalle-3-images
--- license: mit task_categories: - text-to-image - image-to-image tags: - dalle-3 - dall-e - dalle-images - images - croissant size_categories: - 1K<n<10K --- # 🎨 DALL•E 3 Images Dataset This is a dataset of images generated by DALL•E 3. ## Dataset parameters 1. **Count of images**: 3310 2. **Zip file with dataset**: True 3. **Captions with images**: False ## License License for this dataset: [MIT](https://www.mit.edu/~amini/LICENSE.md) ## Use in *datasets* 1. ```bash pip install -q datasets ``` 2. ```py from datasets import load_dataset dataset = load_dataset( "ehristoforu/dalle-3-images", revision="main" ) ``` #### *Enjoy this dataset!*
Mrmeneses03/Lucasmodelo
--- license: openrail ---
golaxy/KnowCoder-Schema-Library
--- configs: - config_name: default data_files: - split: schema_library_ner path: schema_pys/Entities.py - split: schema_library_re path: schema_pys/Relations.py - split: schema_library_ee path: schema_pys/Events.py license: apache-2.0 language: - en tags: - schema size_categories: - 1K<n<10K --- <p align="center"> <img src="https://github.com/ICT-GoKnow/ict-goknow.github.io/blob/main/knowcoder/static/images/logo.png?raw=true" width="80"> </p> <h1 align="center"> KnowCoder: Coding Structured Knowledge into LLMs for Universal Information Extraction </h1> <p align="center"> <a href="https://arxiv.org/abs/2403.07969">📃 Paper</a> | <a href="https://huggingface.co/collections/golaxy/knowcoder-65fc3cd385d98567da412abf" >🤗 Resource (Schema • Data • Model)</a> | <a href="https://ict-goknow.github.io/knowcoder/">🚀 Try KnowCoder (coming soon)!</a> </p> # 📖 KnowCoder Schema ### Code-style Schema Representation Method The code-style schema representation method comprises three basic classes, namely, "Entity", "Relation", and "Event". Based on the three basic classes, we represent all the concepts in the schemas by the corresponding classes. Thus, the instances of each concept can be represented by the objects of the corresponding class. A schema consists of class name, class inheritance, class comments, type hint, and class method. The detailed explanation of each component can be found in our paper. <p align="center"> <img src="https://github.com/ICT-GoKnow/ict-goknow.github.io/blob/main/knowcoder/static/images/intro-schema.png?raw=true" style="width: 95%;"> </p> ### Schema Library Construction We construct the code-style schema library under this schema representation method based on Wikidata (Note that we use the Wikidata dump up to 20220704). 
We select the concepts included in the existing IE datasets created from Wikidata, i.e., [KELM](https://github.com/google-research-datasets/KELM-corpus), [UniversalNER](https://huggingface.co/Universal-NER), [InstructIE](https://huggingface.co/datasets/zjunlp/InstructIE), and [LSEE](https://github.com/acl2017submission/event-data), and derive the constraints among concepts according to their co-occurrences. To construct the taxonomies, we extract the "subclass of" relations among these concepts from Wikidata. To obtain the description of a concept, we use its definition from Wikidata directly or generate its descriptions using GPT-4 if its definition in Wikidata is missing. Finally, the constructed schema library encompasses over 29,177 entity types, 876 relation types, and 519 event types. The detailed statistics of the schema are shown in the following table. Here, "\#Type" denotes the total number of types, "\#Type w/ desc." indicates the count of types with descriptions, and "\#Type w/o desc." signifies the count of types without descriptions. <p align="center"> <img src="https://github.com/ICT-GoKnow/ict-goknow.github.io/blob/main/knowcoder/static/images/schema-library.png?raw=true" style="width: 45%;"> </p>
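To make the representation concrete, here is a minimal, hypothetical sketch of what a code-style schema class could look like under this method. The class bodies below are illustrative assumptions, not copied from the repository's schema files (Entities.py, Relations.py, Events.py), which hold the actual definitions with full comments and type hints:

```python
from typing import List

class Entity:
    """Base class for all entity concepts in a code-style schema library."""
    def __init__(self, name: str):
        self.name = name

class Chemical(Entity):
    """chemical: any substance with a defined molecular composition.

    Illustrative concept class; the docstring plays the role of the
    class comment carrying the concept's description.
    """
    pass

# Instances of a concept are objects of the corresponding class, so
# extraction results can be expressed as a plain list of objects.
results: List[Entity] = [Chemical(name="caffeine")]
print([(type(r).__name__, r.name) for r in results])
```

Running the sketch prints `[('Chemical', 'caffeine')]`: the class name encodes the concept type and the object holds the extracted mention.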
irds/mr-tydi_ko_train
--- pretty_name: '`mr-tydi/ko/train`' viewer: false source_datasets: ['irds/mr-tydi_ko'] task_categories: - text-retrieval --- # Dataset Card for `mr-tydi/ko/train` The `mr-tydi/ko/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/ko/train). # Data This dataset provides: - `queries` (i.e., topics); count=1,295 - `qrels`: (relevance assessments); count=1,317 - For `docs`, use [`irds/mr-tydi_ko`](https://huggingface.co/datasets/irds/mr-tydi_ko) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/mr-tydi_ko_train', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} qrels = load_dataset('irds/mr-tydi_ko_train', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @article{Zhang2021MrTyDi, title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval}, author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin}, year={2021}, journal={arXiv:2108.08787}, } @article{Clark2020TyDiQa, title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages}, author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki}, year={2020}, journal={Transactions of the Association for Computational Linguistics} } ```
fyliao/test
--- license: apache-2.0 ---
FanChen0116/19100_chat_32x_slot_pvi
--- dataset_info: features: - name: id dtype: int64 - name: tokens sequence: string - name: labels sequence: class_label: names: '0': O '1': I-time '2': B-date '3': B-last_name '4': B-people '5': I-date '6': I-people '7': I-last_name '8': I-first_name '9': B-first_name '10': B-time - name: request_slot sequence: string splits: - name: train num_bytes: 191036 num_examples: 1056 - name: validation num_bytes: 5405 num_examples: 32 - name: test num_bytes: 646729 num_examples: 3731 download_size: 34430 dataset_size: 843170 --- # Dataset Card for "19100_chat_32x_slot_pvi" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_stsb_verbal_ing_suffix
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: score dtype: float64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 189926 num_examples: 1060 - name: test num_bytes: 139863 num_examples: 910 - name: train num_bytes: 653689 num_examples: 4025 download_size: 610157 dataset_size: 983478 --- # Dataset Card for "MULTI_VALUE_stsb_verbal_ing_suffix" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mHossain/final_train_v4_test_360000
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: 'Unnamed: 0' dtype: int64 - name: input_text dtype: string - name: target_text dtype: string - name: prefix dtype: string splits: - name: train num_bytes: 6741504.9 num_examples: 18000 - name: test num_bytes: 749056.1 num_examples: 2000 download_size: 3235727 dataset_size: 7490561.0 --- # Dataset Card for "final_train_v4_test_360000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dhiya96/zephyr_text_summarisation_500
--- dataset_info: features: - name: content dtype: string - name: summary dtype: string splits: - name: train num_bytes: 1177833.15 num_examples: 405 - name: test num_bytes: 276281.85 num_examples: 95 download_size: 908924 dataset_size: 1454115.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
Lollitor/MyPubChem
--- dataset_info: config_name: Lollitor features: - name: text dtype: string splits: - name: train num_bytes: 34191420 num_examples: 207932 download_size: 7873702 dataset_size: 34191420 configs: - config_name: Lollitor data_files: - split: train path: Lollitor/train-* --- # Dataset Card for "MyPubChem" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
multi-train/medmcqa_1107
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: query dtype: string - name: pos sequence: string - name: neg sequence: string - name: task dtype: string - name: instruction struct: - name: query dtype: string - name: pos dtype: string - name: neg dtype: string splits: - name: train num_bytes: 194641944 num_examples: 160869 download_size: 102313307 dataset_size: 194641944 --- # Dataset Card for "medmcqa_1107" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Svenni551/toxic-full-uncensored-v2.0
--- dataset_info: features: - name: prompt dtype: string - name: output dtype: string - name: response dtype: string splits: - name: train num_bytes: 1665099 num_examples: 564 download_size: 851965 dataset_size: 1665099 configs: - config_name: default data_files: - split: train path: data/train-* ---
open-llm-leaderboard/details_ChaoticNeutrals__Bepis_9B
--- pretty_name: Evaluation run of ChaoticNeutrals/Bepis_9B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [ChaoticNeutrals/Bepis_9B](https://huggingface.co/ChaoticNeutrals/Bepis_9B) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__Bepis_9B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-04T07:04:07.439007](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Bepis_9B/blob/main/results_2024-03-04T07-04-07.439007.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6257641151601208,\n\ \ \"acc_stderr\": 0.03281244340148477,\n \"acc_norm\": 0.6313208338615179,\n\ \ \"acc_norm_stderr\": 0.03347537782366629,\n \"mc1\": 0.3708690330477356,\n\ \ \"mc1_stderr\": 0.01690969358024883,\n \"mc2\": 0.5330051404428096,\n\ \ \"mc2_stderr\": 0.015225773578936268\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996077,\n\ \ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893449\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.612427803226449,\n\ \ \"acc_stderr\": 0.004862003566798541,\n \"acc_norm\": 0.8012348137821151,\n\ \ \"acc_norm_stderr\": 0.003982553164086263\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\ \ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\ \ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\ \ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\ \ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \ \ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\ \ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.037455547914624555\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\ \ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\ \ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367406,\n\ \ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367406\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\ \ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\ \ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\ acc_norm\": 0.41005291005291006,\n 
\"acc_norm_stderr\": 0.025331202438944433\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\ \ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\ \ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895514,\n \"\ acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895514\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\ acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\ acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\ \ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830506,\n\ \ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830506\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968351,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968351\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \ \ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\ : 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n\ \ \"acc_stderr\": 0.016197807956848047,\n \"acc_norm\": 0.8275229357798165,\n\ \ \"acc_norm_stderr\": 0.016197807956848047\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n\ \ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\ acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \ \ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\ \ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\ \ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\ \ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\ \ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\ \ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\ \ \"acc_stderr\": 0.022509033937077795,\n \"acc_norm\": 0.8632478632478633,\n\ \ \"acc_norm_stderr\": 0.022509033937077795\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\ \ \"acc_stderr\": 0.014419123980931894,\n \"acc_norm\": 0.7956577266922095,\n\ \ \"acc_norm_stderr\": 0.014419123980931894\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\ \ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n\ \ \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n\ \ \"acc_norm_stderr\": 0.016482782187500666\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\ \ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\ \ \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n\ \ \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\ \ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \ \ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\ \ \"acc_stderr\": 0.012716941720734811,\n \"acc_norm\": 0.45436766623207303,\n\ \ \"acc_norm_stderr\": 0.012716941720734811\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\ \ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505518,\n \ \ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505518\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\ \ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\ \ 
\"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\ \ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\ \ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\ \ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n\ \ \"mc1_stderr\": 0.01690969358024883,\n \"mc2\": 0.5330051404428096,\n\ \ \"mc2_stderr\": 0.015225773578936268\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650877\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3912054586808188,\n \ \ \"acc_stderr\": 0.013442502402794302\n }\n}\n```" repo_url: https://huggingface.co/ChaoticNeutrals/Bepis_9B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|arc:challenge|25_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-04T07-04-07.439007.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|gsm8k|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hellaswag|10_2024-03-04T07-04-07.439007.parquet' - split: latest 
path: - '**/details_harness|hellaswag|10_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-04-07.439007.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-04-07.439007.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-04-07.439007.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-04-07.439007.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-04-07.439007.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-04-07.439007.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-04-07.439007.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-management|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-04-07.439007.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|truthfulqa:mc|0_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-04T07-04-07.439007.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_04T07_04_07.439007 path: - '**/details_harness|winogrande|5_2024-03-04T07-04-07.439007.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-04T07-04-07.439007.parquet' - config_name: results data_files: - split: 
2024_03_04T07_04_07.439007 path: - results_2024-03-04T07-04-07.439007.parquet - split: latest path: - results_2024-03-04T07-04-07.439007.parquet --- # Dataset Card for Evaluation run of ChaoticNeutrals/Bepis_9B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ChaoticNeutrals/Bepis_9B](https://huggingface.co/ChaoticNeutrals/Bepis_9B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__Bepis_9B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-04T07:04:07.439007](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Bepis_9B/blob/main/results_2024-03-04T07-04-07.439007.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6257641151601208, "acc_stderr": 0.03281244340148477, "acc_norm": 0.6313208338615179, "acc_norm_stderr": 0.03347537782366629, "mc1": 0.3708690330477356, "mc1_stderr": 0.01690969358024883, "mc2": 0.5330051404428096, "mc2_stderr": 0.015225773578936268 }, "harness|arc:challenge|25": { "acc": 0.5819112627986348, "acc_stderr": 0.014413988396996077, "acc_norm": 0.6254266211604096, "acc_norm_stderr": 0.014144193471893449 }, "harness|hellaswag|10": { "acc": 0.612427803226449, "acc_stderr": 0.004862003566798541, "acc_norm": 0.8012348137821151, "acc_norm_stderr": 0.003982553164086263 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.042849586397534015, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.042849586397534015 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322666, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.037455547914624555, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, 
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.0372424959581773, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.0372424959581773 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4803921568627451, "acc_stderr": 0.04971358884367406, "acc_norm": 0.4803921568627451, "acc_norm_stderr": 0.04971358884367406 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5361702127659574, "acc_stderr": 0.032600385118357715, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482758, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944433, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944433 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7548387096774194, "acc_stderr": 0.024472243840895514, "acc_norm": 0.7548387096774194, "acc_norm_stderr": 0.024472243840895514 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.035145285621750094, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.035145285621750094 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6256410256410256, "acc_stderr": 0.024537591572830506, "acc_norm": 0.6256410256410256, "acc_norm_stderr": 0.024537591572830506 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.02822644674968351, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.02822644674968351 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 0.0399552400768168, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.0399552400768168 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8275229357798165, "acc_stderr": 0.016197807956848047, "acc_norm": 0.8275229357798165, "acc_norm_stderr": 0.016197807956848047 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 
0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7843137254901961, "acc_stderr": 0.028867431449849313, "acc_norm": 0.7843137254901961, "acc_norm_stderr": 0.028867431449849313 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.759493670886076, "acc_stderr": 0.027820781981149685, "acc_norm": 0.759493670886076, "acc_norm_stderr": 0.027820781981149685 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.03844876139785271, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094633, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094633 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.041858325989283136, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.041858325989283136 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.022509033937077795, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077795 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 
0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7956577266922095, "acc_stderr": 0.014419123980931894, "acc_norm": 0.7956577266922095, "acc_norm_stderr": 0.014419123980931894 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.708092485549133, "acc_stderr": 0.02447699407624734, "acc_norm": 0.708092485549133, "acc_norm_stderr": 0.02447699407624734 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41564245810055866, "acc_stderr": 0.016482782187500666, "acc_norm": 0.41564245810055866, "acc_norm_stderr": 0.016482782187500666 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6797385620915033, "acc_stderr": 0.026716118380156847, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.026716118380156847 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464482, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464482 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7098765432098766, "acc_stderr": 0.025251173936495026, "acc_norm": 0.7098765432098766, "acc_norm_stderr": 0.025251173936495026 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45436766623207303, "acc_stderr": 0.012716941720734811, "acc_norm": 0.45436766623207303, "acc_norm_stderr": 0.012716941720734811 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.029163128570670733, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.029163128570670733 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6372549019607843, "acc_stderr": 0.019450768432505518, "acc_norm": 0.6372549019607843, "acc_norm_stderr": 0.019450768432505518 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 
0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.029279567411065677, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.029279567411065677 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786848, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786848 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.3708690330477356, "mc1_stderr": 0.01690969358024883, "mc2": 0.5330051404428096, "mc2_stderr": 0.015225773578936268 }, "harness|winogrande|5": { "acc": 0.7647987371744278, "acc_stderr": 0.011920008163650877 }, "harness|gsm8k|5": { "acc": 0.3912054586808188, "acc_stderr": 0.013442502402794302 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
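The per-task results shown above under "Latest results" form a flat dictionary keyed by harness task name. As a minimal sketch of navigating that structure, the snippet below uses a hand-copied excerpt of the JSON from this card rather than downloading anything; with the real results file you would obtain the same dictionary via `json.load()`.

```python
# Excerpt of the "Latest results" JSON shown in this card (values copied
# verbatim from the card; only a few tasks are included for illustration).
results = {
    "all": {"acc": 0.6257641151601208, "acc_norm": 0.6313208338615179},
    "harness|arc:challenge|25": {"acc": 0.5819112627986348, "acc_norm": 0.6254266211604096},
    "harness|hellaswag|10": {"acc": 0.612427803226449, "acc_norm": 0.8012348137821151},
    "harness|winogrande|5": {"acc": 0.7647987371744278},
}

def metric(results, task, name="acc"):
    """Look up one metric for one harness task, e.g. acc for winogrande."""
    return results[task][name]

print(metric(results, "harness|winogrande|5"))          # 0.7647987371744278
print(metric(results, "harness|hellaswag|10", "acc_norm"))  # 0.8012348137821151
```

The `harness|<task>|<n_shots>` key format mirrors the parquet file names listed in this repository's configs.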
autoevaluate/autoeval-eval-chintagunta85__ncbi_disease-ncbi_disease-f4d843-3192989822
--- type: predictions tags: - autotrain - evaluation datasets: - chintagunta85/ncbi_disease eval_info: task: entity_extraction model: sschet/biobert_diseases_ner metrics: [] dataset_name: chintagunta85/ncbi_disease dataset_config: ncbi_disease dataset_split: test col_mapping: tokens: tokens tags: ner_tags --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Token Classification * Model: sschet/biobert_diseases_ner * Dataset: chintagunta85/ncbi_disease * Config: ncbi_disease * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@sschet](https://huggingface.co/sschet) for evaluating this model.
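The `col_mapping` above pairs a `tokens` column with a `ner_tags` column, the usual BIO-tagged layout for token classification. As a hedged illustration of how such predictions are turned back into entity spans (the `B-Disease`/`I-Disease` labels below are illustrative; the real label set depends on the dataset config and model):

```python
# Sketch: collapsing BIO-style token tags into (entity_text, label) spans.
def bio_to_spans(tokens, tags):
    """Collect contiguous B-/I- runs into entity spans."""
    spans, current, label = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append((" ".join(current), label))
            current, label = [tok], tag[2:]
        elif tag.startswith("I-") and current:
            current.append(tok)
        else:  # "O" tag or orphan "I-": close any open span
            if current:
                spans.append((" ".join(current), label))
            current, label = [], None
    if current:
        spans.append((" ".join(current), label))
    return spans

tokens = ["Patients", "with", "breast", "cancer", "were", "enrolled", "."]
tags = ["O", "O", "B-Disease", "I-Disease", "O", "O", "O"]
print(bio_to_spans(tokens, tags))  # [('breast cancer', 'Disease')]
```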
VineelPhillips/embedding
--- license: wtfpl ---
whu9/wiki_20220301flatten
--- dataset_info: features: - name: sentences dtype: string splits: - name: train num_bytes: 1224908657 num_examples: 22874571 download_size: 778259253 dataset_size: 1224908657 --- # Dataset Card for "wiki_20220301flatten" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
irwan19/test
--- license: mit ---
Seongill/NQ_missing_5_masked
--- dataset_info: features: - name: question dtype: string - name: answers sequence: string - name: ctxs list: - name: hasanswer dtype: bool - name: id dtype: string - name: score dtype: float64 - name: text dtype: string - name: title dtype: string - name: has_answer dtype: bool - name: masked_query dtype: string - name: query_embedding sequence: float32 splits: - name: train num_bytes: 23378788 num_examples: 3610 download_size: 20867398 dataset_size: 23378788 configs: - config_name: default data_files: - split: train path: data/train-* ---
promptora11/llama2
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 338808 num_examples: 200 download_size: 201257 dataset_size: 338808 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "llama2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Asomaa/demo
--- license: apache-2.0 ---
swaroopajit/next-dataset-refined-batch-4000
--- dataset_info: features: - name: caption dtype: string - name: image dtype: image splits: - name: train num_bytes: 316595519.0 num_examples: 1000 download_size: 289227918 dataset_size: 316595519.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "next-dataset-refined-batch-4000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
GEM-submissions/lewtun__hugging-face-test-t5-base.outputs.json-36bf2a59__1646052073
--- benchmark: gem type: prediction submission_name: Hugging Face test T5-base.outputs.json 36bf2a59 tags: - evaluation - benchmark --- # GEM Submission Submission name: Hugging Face test T5-base.outputs.json 36bf2a59
airaspberry/sweater-cads
--- license: openrail ---
taesiri/notre-arte
--- dataset_info: features: - name: image dtype: image splits: - name: validation num_bytes: 670571853.848 num_examples: 3831 download_size: 739298076 dataset_size: 670571853.848 license: cc-by-4.0 task_categories: - image-to-text tags: - art size_categories: - 1K<n<10K --- # Notre Arte Image Dataset ## Description This dataset comprises images sourced from the [Notre Arte Instagram page](https://www.instagram.com/notre.arte/) and is intended to serve as a challenging and intriguing benchmark for testing visual language models and large multimodal language models. The images are characterized by their unique artistic style and complexity, which can provide a robust test of the capabilities of modern AI models. ## Usage This dataset is intended for research purposes, specifically the evaluation of visual and multimodal language models. ## Structure - Each entry in the dataset is an image without any annotation or category. ## License This dataset is made available under a [Creative Commons Attribution 4.0 International License](https://creativecommons.org/licenses/by/4.0/).
mmenendezg/raw_pneumonia_x_ray
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': normal '1': pneumonia splits: - name: train num_bytes: 3197295656.864 num_examples: 5232 - name: test num_bytes: 111133345.0 num_examples: 624 download_size: 1263131512 dataset_size: 3308429001.864 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* --- # Dataset Card for "raw_pneumonia_x_ray" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
benmainbird/prompt_answers_v1
--- language: - en pretty_name: "Open Prompt LLM Answers" tags: - llm - prompts - answers --- # Dataset Card for Open Prompt Answers ## Dataset Summary This dataset provides answers from different Large Language Models to prompts from several public datasets. + `prompt`: a prompt from an open-source dataset + `prompt_origin`: the dataset the prompt is taken from + `Llama-2-7b-chat-hf_output`: output generation of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) model + `Llama-2-7b-chat-hf_generation_time`: generation duration *in seconds* for the answer of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) model + `oasst-sft-4-pythia-12b_output`: output generation of [OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5](https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5) model + `oasst-sft-4-pythia-12b_generation_time`: generation duration *in seconds* for the answer of [OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5](https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5) model + `vicuna-7b-v1.5_output`: output generation of [lmsys/vicuna-7b-v1.5](https://huggingface.co/lmsys/vicuna-7b-v1.5) model + `vicuna-7b-v1.5_generation_time`: generation duration *in seconds* for the answer of [lmsys/vicuna-7b-v1.5](https://huggingface.co/lmsys/vicuna-7b-v1.5) model ## Prompt Sources The prompts are a subset of the prompts in the following datasets: + [OpenAssistant/oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1): only English prompts with no previous conversation tree (`role = prompter` and `parent_id = null`) + [Anthropic/hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf): only the initial input of the *Human* as prompt + [tatsu-lab/alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca): concatenated `instruction` and `input` to form the prompt +
[Dahoas/synthetic-instruct-gptj-pairwise](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise): prompts taken from the `prompt` column ## Output Generation The generation configuration is identical for all models: + `temperature`: 0.7 + `max_new_tokens`: 512 + `repetition_penalty`: 1.0 Generation durations are reported in seconds.
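The `*_generation_time` columns can be produced by wrapping each model call in a wall-clock timer. A minimal, model-agnostic sketch of that pattern — the `fake_generate` callable below is a stand-in for the real model call (e.g. a transformers generation call with the same keyword arguments), which is not part of this card:

```python
import time

# Shared generation settings from the card, passed identically to every model.
GEN_KWARGS = {"temperature": 0.7, "max_new_tokens": 512, "repetition_penalty": 1.0}

def timed_generate(generate, prompt, **gen_kwargs):
    """Run `generate` on `prompt` and return (output, duration in seconds)."""
    start = time.perf_counter()
    output = generate(prompt, **gen_kwargs)
    return output, time.perf_counter() - start

# Stand-in generator for illustration only; it ignores the generation kwargs.
def fake_generate(prompt, **kwargs):
    return prompt.upper()

answer, seconds = timed_generate(fake_generate, "What is a halo?", **GEN_KWARGS)
print(answer, round(seconds, 6))
```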
open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged
--- pretty_name: Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T02:32:29.889324](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged/blob/main/results_2023-10-23T02-32-29.889324.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.012164429530201342,\n\ \ \"em_stderr\": 0.0011226072817371853,\n \"f1\": 0.07720742449664415,\n\ \ \"f1_stderr\": 0.0018320825904246663,\n \"acc\": 0.3909059684425251,\n\ \ \"acc_stderr\": 0.009118223911065027\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.012164429530201342,\n \"em_stderr\": 0.0011226072817371853,\n\ \ \"f1\": 0.07720742449664415,\n \"f1_stderr\": 0.0018320825904246663\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \ \ \"acc_stderr\": 0.005829898355937193\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\ \ }\n}\n```" repo_url: https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|arc:challenge|25_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|arc:challenge|25_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-05T11:32:06.887851.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T02_32_29.889324 path: - '**/details_harness|drop|3_2023-10-23T02-32-29.889324.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T02-32-29.889324.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T02_32_29.889324 path: - '**/details_harness|gsm8k|5_2023-10-23T02-32-29.889324.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T02-32-29.889324.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hellaswag|10_2023-08-31T20:14:35.728415.parquet' - split: 
2023_09_05T11_32_06.887851 path: - '**/details_harness|hellaswag|10_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:14:35.728415.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:14:35.728415.parquet' - 
'**/details_harness|hendrycksTest-management|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-31T20:14:35.728415.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 
2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet' - config_name: 
harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:14:35.728415.parquet' - split: 
2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-management|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet' - config_name: 
harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - 
'**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - 
'**/details_harness|hendrycksTest-virology|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_31T20_14_35.728415 path: - '**/details_harness|truthfulqa:mc|0_2023-08-31T20:14:35.728415.parquet' - split: 2023_09_05T11_32_06.887851 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T11:32:06.887851.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T11:32:06.887851.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T02_32_29.889324 path: - '**/details_harness|winogrande|5_2023-10-23T02-32-29.889324.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T02-32-29.889324.parquet' - config_name: results data_files: - split: 2023_08_31T20_14_35.728415 path: - results_2023-08-31T20:14:35.728415.parquet - split: 2023_09_05T11_32_06.887851 path: - results_2023-09-05T11:32:06.887851.parquet - split: 2023_10_23T02_32_29.889324 path: - results_2023-10-23T02-32-29.889324.parquet - split: latest path: - results_2023-10-23T02-32-29.889324.parquet --- # Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged - 
**Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T02:32:29.889324](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged/blob/main/results_2023-10-23T02-32-29.889324.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.012164429530201342,
        "em_stderr": 0.0011226072817371853,
        "f1": 0.07720742449664415,
        "f1_stderr": 0.0018320825904246663,
        "acc": 0.3909059684425251,
        "acc_stderr": 0.009118223911065027
    },
    "harness|drop|3": {
        "em": 0.012164429530201342,
        "em_stderr": 0.0011226072817371853,
        "f1": 0.07720742449664415,
        "f1_stderr": 0.0018320825904246663
    },
    "harness|gsm8k|5": {
        "acc": 0.04700530705079606,
        "acc_stderr": 0.005829898355937193
    },
    "harness|winogrande|5": {
        "acc": 0.7348066298342542,
        "acc_stderr": 0.01240654946619286
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
ricardo-filho/tweets_pt_sentiment_analysis
--- dataset_info: features: - name: id dtype: int64 - name: text dtype: string - name: label dtype: int64 splits: - name: train num_bytes: 80942752 num_examples: 819346 - name: validation num_bytes: 824575 num_examples: 8360 - name: test num_bytes: 814723 num_examples: 8360 download_size: 61192823 dataset_size: 82582050 --- # Dataset Card for "tweets_pt_sentiment_analysis" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_FabbriSimo01__Bloom_1b_Quantized
--- pretty_name: Evaluation run of FabbriSimo01/Bloom_1b_Quantized dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [FabbriSimo01/Bloom_1b_Quantized](https://huggingface.co/FabbriSimo01/Bloom_1b_Quantized)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 3 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FabbriSimo01__Bloom_1b_Quantized\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-17T14:41:55.154995](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Bloom_1b_Quantized/blob/main/results_2023-09-17T14-41-55.154995.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\ \ \"em_stderr\": 0.00041913301788268413,\n \"f1\": 0.047125629194631036,\n\ \ \"f1_stderr\": 0.0012660847237774002,\n \"acc\": 0.27897440899296483,\n\ \ \"acc_stderr\": 0.007517237128084831\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268413,\n\ \ \"f1\": 0.047125629194631036,\n \"f1_stderr\": 0.0012660847237774002\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \ \ \"acc_stderr\": 0.0010717793485492627\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5564325177584846,\n \"acc_stderr\": 0.0139626949076204\n\ \ }\n}\n```" repo_url: https://huggingface.co/FabbriSimo01/Bloom_1b_Quantized leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_drop_3 data_files: - split: 2023_09_17T14_41_55.154995 path: - '**/details_harness|drop|3_2023-09-17T14-41-55.154995.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-17T14-41-55.154995.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_17T14_41_55.154995 path: - '**/details_harness|gsm8k|5_2023-09-17T14-41-55.154995.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-17T14-41-55.154995.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_17T14_41_55.154995 path: - '**/details_harness|winogrande|5_2023-09-17T14-41-55.154995.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-17T14-41-55.154995.parquet' - config_name: results data_files: - split: 2023_09_17T14_41_55.154995 path: - results_2023-09-17T14-41-55.154995.parquet - split: latest path: - results_2023-09-17T14-41-55.154995.parquet --- # Dataset Card for Evaluation run of FabbriSimo01/Bloom_1b_Quantized ## Dataset Description - **Homepage:** - **Repository:** 
https://huggingface.co/FabbriSimo01/Bloom_1b_Quantized
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [FabbriSimo01/Bloom_1b_Quantized](https://huggingface.co/FabbriSimo01/Bloom_1b_Quantized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FabbriSimo01__Bloom_1b_Quantized",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T14:41:55.154995](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Bloom_1b_Quantized/blob/main/results_2023-09-17T14-41-55.154995.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.00041913301788268413,
        "f1": 0.047125629194631036,
        "f1_stderr": 0.0012660847237774002,
        "acc": 0.27897440899296483,
        "acc_stderr": 0.007517237128084831
    },
    "harness|drop|3": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.00041913301788268413,
        "f1": 0.047125629194631036,
        "f1_stderr": 0.0012660847237774002
    },
    "harness|gsm8k|5": {
        "acc": 0.001516300227445034,
        "acc_stderr": 0.0010717793485492627
    },
    "harness|winogrande|5": {
        "acc": 0.5564325177584846,
        "acc_stderr": 0.0139626949076204
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
C-MTEB/MMarcoRetrieval
--- configs: - config_name: default data_files: - split: corpus path: data/corpus-* - split: queries path: data/queries-* dataset_info: features: - name: id dtype: string - name: text dtype: string splits: - name: corpus num_bytes: 32552468 num_examples: 106813 - name: queries num_bytes: 303316 num_examples: 6980 download_size: 20422289 dataset_size: 32855784 --- # Dataset Card for "MMarcoRetrieval" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
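The card above only documents the schema: both the "corpus" and "queries" splits expose `{id, text}` records. As an offline illustration of how such rows fit a retrieval task, here is a toy scorer over fabricated rows (the sample passages and the naive token-overlap ranking are placeholders, not part of the dataset or of any official evaluation):

```python
# Made-up rows mirroring the {id, text} schema of the "corpus" and
# "queries" splits described in the card. Real rows would come from
# load_dataset("C-MTEB/MMarcoRetrieval").
corpus = [
    {"id": "d1", "text": "Passage about machine translation."},
    {"id": "d2", "text": "Passage about information retrieval."},
]
queries = [{"id": "q1", "text": "information retrieval"}]

def retrieve(query_text, corpus, top_k=1):
    """Rank corpus passages by naive token overlap with the query."""
    q_tokens = set(query_text.lower().split())
    scored = [
        (len(q_tokens & set(doc["text"].lower().split())), doc["id"])
        for doc in corpus
    ]
    scored.sort(reverse=True)
    return [doc_id for _, doc_id in scored[:top_k]]

print(retrieve(queries[0]["text"], corpus))
```

A real retrieval system would replace the token-overlap scorer with BM25 or a dense embedding model; the row format stays the same.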
georgeprethesh/1
--- license: openrail ---
jitx/distillation_code_100
--- dataset_info: features: - name: santacoder_prompts dtype: string - name: fim_inputs dtype: string - name: label_middles dtype: string - name: santacoder_outputs dtype: string - name: openai_rationales dtype: string splits: - name: train num_bytes: 399654 num_examples: 100 download_size: 155882 dataset_size: 399654 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "distillation_code_100" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
one-sec-cv12/chunk_213
--- dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 23810491296.25 num_examples: 247902 download_size: 22645976382 dataset_size: 23810491296.25 --- # Dataset Card for "chunk_213" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
m-elio/spell_generation
---
license: gfdl
language:
- en
---

# Dataset Card for D&D 5th Edition Spells

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

This dataset contains spells that were obtained from the Homebrew section of the [D&D Wiki](https://www.dandwiki.com/wiki/Main_Page). The spells were filtered and processed according to the following procedure:

- Removed spells not having one of the following tags: *Level*, *School*, *Duration*, *Casting time*, *Range*
- Removed spells not correctly specifying the required components (that is, "V", "S" or "M")
- Removed spells associated with non-official classes
- Removed the "votes" section that could be present in the spell description

The spells were then formatted following this format:

```
Name:
Level:
School:
Classes:
Casting time:
Range:
Duration:
Components: [If no components are required, then this field has a None value]
Material cost: [If there is no "M" character in the Components field, then this field is skipped]
Description:
```

- **Language(s) (NLP):** English
- **License:** gfdl
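The conditional rules in the template (a None value for empty Components, and a Material cost line emitted only when "M" is present) can be sketched in plain Python. The spell record below is made up for illustration and is not drawn from the dataset:

```python
def format_spell(spell: dict) -> str:
    """Render a spell record in the template described in this card."""
    lines = [
        f"Name: {spell['name']}",
        f"Level: {spell['level']}",
        f"School: {spell['school']}",
        f"Classes: {spell['classes']}",
        f"Casting time: {spell['casting_time']}",
        f"Range: {spell['range']}",
        f"Duration: {spell['duration']}",
        # If no components are required, this field holds the value None.
        f"Components: {spell['components'] if spell['components'] else None}",
    ]
    # "Material cost" appears only when "M" is among the components.
    if spell["components"] and "M" in spell["components"]:
        lines.append(f"Material cost: {spell['material_cost']}")
    lines.append(f"Description: {spell['description']}")
    return "\n".join(lines)

# Fabricated example spell, only to exercise the formatting rules.
example = {
    "name": "Arcane Spark", "level": "1", "school": "Evocation",
    "classes": "Wizard", "casting_time": "1 action", "range": "30 feet",
    "duration": "Instantaneous", "components": "V, S",
    "material_cost": "", "description": "A small bolt of arcane energy.",
}
print(format_spell(example))
```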
autoevaluate/autoeval-eval-squadshifts-amazon-74b272-2017966729
--- type: predictions tags: - autotrain - evaluation datasets: - squadshifts eval_info: task: extractive_question_answering model: deepset/roberta-base-squad2 metrics: [] dataset_name: squadshifts dataset_config: amazon dataset_split: test col_mapping: context: context question: question answers-text: answers.text answers-answer_start: answers.answer_start --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Question Answering * Model: deepset/roberta-base-squad2 * Dataset: squadshifts * Config: amazon * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@viralshanker](https://huggingface.co/viralshanker) for evaluating this model.
micsell/hebrew_keywords
--- dataset_info: features: - name: audio dtype: audio - name: label dtype: class_label: names: '0': daateh '1': hait '2': higat '3': hona '4': itah '5': lah '6': otah '7': shelah splits: - name: train num_bytes: 67028933.331 num_examples: 5353 download_size: 66602446 dataset_size: 67028933.331 configs: - config_name: default data_files: - split: train path: data/train-* ---
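As a quick reference, the integer class-label ids in the card map onto the eight keyword names as follows (plain Python, no download needed; the mapping is taken directly from the `class_label` names listed above):

```python
# Keyword names in id order, as listed in the card's class_label block.
names = ["daateh", "hait", "higat", "hona", "itah", "lah", "otah", "shelah"]

id2label = dict(enumerate(names))
label2id = {name: i for i, name in enumerate(names)}

print(id2label[3], label2id["shelah"])
```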
5CD-AI/Vietnamese-ShareGPT4Vision-gg-translated
--- task_categories: - visual-question-answering - question-answering language: - en - vi size_categories: - 100K<n<1M tags: - multi-modal - image-captioning - vqa - sharegpt4v - GPT4-Vision ---
HuggingFaceH4/helpful-instructions
---
license: apache-2.0
tags:
- human-feedback
pretty_name: Helpful Instructions
---

# Dataset Card for Helpful Instructions

## Dataset Description

- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** Lewis Tunstall

### Dataset Summary

Helpful Instructions is a dataset of `(instruction, demonstration)` pairs that are derived from public datasets. As the name suggests, it focuses on instructions that are "helpful", i.e. the kind of questions or tasks a human user might instruct an AI assistant to perform. You can load the dataset as follows:

```python
from datasets import load_dataset

# Load all subsets
helpful_instructions = load_dataset("HuggingFaceH4/helpful_instructions")

# Load a single subset
helpful_instructions_subset = load_dataset("HuggingFaceH4/helpful_instructions", data_dir="data/helpful-anthropic-raw")
```

### Supported Tasks and Leaderboards

This dataset can be used to fine-tune pretrained language models to follow instructions.

### Languages

English

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
FanChen0116/syn_few7_7100_chat_all_data_pvi
--- dataset_info: features: - name: id dtype: int64 - name: tokens sequence: string - name: labels sequence: class_label: names: '0': O '1': I-time '2': B-date '3': B-last_name '4': B-people '5': I-date '6': I-people '7': I-last_name '8': I-first_name '9': B-first_name '10': B-time - name: request_slot sequence: string splits: - name: train num_bytes: 558759 num_examples: 3335 - name: validation num_bytes: 646729 num_examples: 3731 - name: test num_bytes: 646729 num_examples: 3731 download_size: 92716 dataset_size: 1852217 --- # Dataset Card for "syn_few7_7100_chat_all_data_pvi" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
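The `labels` field stores integer ids per token; decoding them back to the tag names listed in the card is a simple lookup. The token/label pair below is a fabricated example, not a row from the dataset:

```python
# Tag names in id order, copied from the card's class_label block.
label_names = [
    "O", "I-time", "B-date", "B-last_name", "B-people", "I-date",
    "I-people", "I-last_name", "I-first_name", "B-first_name", "B-time",
]

# Made-up token sequence with plausible label ids for illustration.
tokens = ["book", "for", "two", "people", "on", "friday"]
labels = [0, 0, 4, 6, 0, 2]  # ids as stored in the "labels" sequence

decoded = [label_names[i] for i in labels]
print(list(zip(tokens, decoded)))
```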
HazySkies/SV4-M-ZP
--- language: - en --- **Zipped alternative dataset for use with some colabs, and ease of download in some cases.** A hoof-ful of SoVits 4.0 (so-vits-svc 4.0) model contributions provided for the Pony Preservation Project<br>*Assume trained for speaking unless otherwise specified — Most are still fully capable of singing* **This dataset consists of dominantly mares:** <br> \>MLP:FiM canon characters<br> \>MLP fandom characters <br>**And adjacent:** <br>\>Them's Fightin' Herds<br>\>Other voiced equines **[AUG 23] Initial Models:** <br>Athena (Shawn Keller) - 25k steps<br>Cadance (FiM) - 25k steps<br>Saffron Masala (FiM) - 15k steps<br>Shining Armor (FiM) - 25k steps<br>Arizona (TFH) - 20k steps<br>Velvet (TFH) - 20k steps<br>Derpy Hooves (FiM) - 69k steps <br> <br> <br> Unzipped version of dataset here: https://huggingface.co/datasets/HazySkies/SV4-M/
weqweasdas/preference_dataset_mixture2_and_safe_pku150k
--- dataset_info: features: - name: chosen_score dtype: float64 - name: rejected list: - name: content dtype: string - name: role dtype: string - name: rejected_score dtype: float64 - name: chosen list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 1935063763 num_examples: 678029 download_size: 1101844862 dataset_size: 1935063763 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "preference_dataset_mixture2_and_safe_pku150k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
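A minimal sketch of flattening one record into text pairs for reward-model training, using the documented features. The role values `user`/`assistant` are an assumption -- the card only documents the `{content, role}` message schema, not the role names:

```python
# Sketch: flattening a chosen/rejected preference record into two transcripts.
# The role names ("user"/"assistant") are assumed, not documented in the card.

def to_text(messages):
    """Join a list of {content, role} messages into a single transcript."""
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

# Toy record mirroring the documented features.
record = {
    "chosen": [
        {"role": "user", "content": "How do I boil an egg?"},
        {"role": "assistant", "content": "Simmer it for about 7 minutes."},
    ],
    "chosen_score": 1.0,
    "rejected": [
        {"role": "user", "content": "How do I boil an egg?"},
        {"role": "assistant", "content": "I don't know."},
    ],
    "rejected_score": 0.0,
}

chosen_text = to_text(record["chosen"])
rejected_text = to_text(record["rejected"])
# Real records would come from:
# load_dataset("weqweasdas/preference_dataset_mixture2_and_safe_pku150k", split="train")
```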
AIML-TUDA/i2p-adversarial-split
--- license: mit --- # I2P - Adversarial Samples We here provide a subset of the inappropriate image prompts (I2P) benchmark that are solid candidates for adversarial testing. Specifically, all prompts in this dataset provided here are reasonably likely to produce inappropriate images and bypass the MidJourney prompt filter. More details are provided in our AACL workshop paper: ["Distilling Adversarial Prompts from Safety Benchmarks: Report for the Adversarial Nibbler Challenge"](https://arxiv.org/abs/2309.11575)
TrainingDataPro/2d-masks-presentation-attack-detection
--- language: - en license: cc-by-nc-nd-4.0 task_categories: - image-classification tags: - code dataset_info: features: - name: user dtype: string - name: real_1 dtype: string - name: real_2 dtype: string - name: real_3 dtype: string - name: real_4 dtype: string - name: mask_1 dtype: string - name: mask_2 dtype: string - name: mask_3 dtype: string - name: mask_4 dtype: string - name: cut_1 dtype: string - name: cut_2 dtype: string - name: cut_3 dtype: string - name: cut_4 dtype: string splits: - name: train num_bytes: 4607 num_examples: 17 download_size: 901061924 dataset_size: 4607 --- # 2D Masks Presentation Attack Detection The dataset consists of videos of individuals wearing printed 2D masks, or printed 2D masks with cut-out eyes, while looking directly at the camera. Videos are filmed in different lighting conditions and in different places (*indoors, outdoors*). Each video in the dataset has an approximate duration of 2 seconds. ### Types of videos in the dataset - **real** - 4 videos of the person without a mask. - **mask** - 4 videos of the person wearing a printed 2D mask. - **cut** - 4 videos of the person wearing a printed 2D mask with cut-out holes for eyes. ![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2Fd29be8e22b3376efc1260f0a90f66d5c%2FMacBook%20Air%20-%201%20(2).png?generation=1690460078319549&alt=media) People in the dataset wear different accessories, such as *glasses, caps, scarves, hats and masks*. Most of these are worn over the mask; however, *glasses and masks* may also be printed on the mask itself. ![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2Faa17e51fbcb74d5920dd0f5331f89668%2FMacBook%20Air%20-%201%20(3).png?generation=1690462300531653&alt=media) The dataset serves as a valuable resource for computer vision, anti-spoofing tasks, video analysis, and security systems. 
It allows for the development of algorithms and models that can effectively detect attacks perpetrated by individuals wearing printed 2D masks. Studying the dataset may lead to the development of improved security systems, surveillance technologies, and solutions to mitigate the risks associated with masked individuals carrying out attacks. # Get the dataset ### This is just an example of the data Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=2d-masks-presentation-attack-detection) to discuss your requirements, learn about the price and buy the dataset. # Content ### The folder **"files"** includes 17 folders - one per person in the sample - each containing 12 videos of the individual ### File with the extension .csv - **user**: person in the videos, - **real_1,... real_4**: links to the videos of the person without a mask, - **mask_1,... mask_4**: links to the videos with a 2D mask, - **cut_1,... cut_4**: links to the videos with a 2D mask with cut-out eyes # Attacks might be collected in accordance with your requirements ## [**TrainingData**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=2d-masks-presentation-attack-detection) provides high-quality data annotation tailored to your needs More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>** TrainingData's GitHub: **<https://github.com/Trainingdata-datamarket/TrainingData_All_datasets>**
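As a sketch, the wide per-person rows in the .csv described above can be reshaped into long-form clip records using the documented column names (the example URLs below are placeholders, not real links):

```python
# Sketch: reshaping one wide CSV row (columns documented in the card) into
# long-form (user, attack_type, clip_index, url) records for easier iteration.

def row_to_clips(row):
    clips = []
    for kind in ("real", "mask", "cut"):
        for i in range(1, 5):
            clips.append((row["user"], kind, i, row[f"{kind}_{i}"]))
    return clips

# Toy row with placeholder links, matching the documented schema.
row = {"user": "person_01"}
for kind in ("real", "mask", "cut"):
    for i in range(1, 5):
        row[f"{kind}_{i}"] = f"https://example.com/{kind}_{i}.mp4"

clips = row_to_clips(row)
# 3 video types x 4 clips = 12 videos per person, as stated in the card.
```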
CyberHarem/tethys_fireemblem
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of tethys (Fire Emblem) This is the dataset of tethys (Fire Emblem), containing 20 images and their tags. The core tags of this character are `braid, earrings, long_hair, red_hair, hoop_earrings, single_braid, red_eyes, breasts, facial_mark, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 20 | 17.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tethys_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 20 | 11.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tethys_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 36 | 19.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tethys_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 20 | 16.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tethys_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 36 | 27.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tethys_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/tethys_fireemblem', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, armlet, midriff, navel, dancer, looking_at_viewer, forehead_mark, smile, bare_shoulders, bracelet, simple_background, pants, white_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | armlet | midriff | navel | dancer | looking_at_viewer | forehead_mark | smile | bare_shoulders | bracelet | simple_background | pants | white_background | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:----------|:--------|:---------|:--------------------|:----------------|:--------|:-----------------|:-----------|:--------------------|:--------|:-------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
SLTP/HLT-AA-C21-Alpaca
--- dataset_info: features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 5632866 num_examples: 6130 download_size: 1233010 dataset_size: 5632866 --- # Dataset Card for "HLT-AA-C21-Alpaca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mnoukhov/openai_summarize_generated_20k_relabel_410m_dpo1
--- dataset_info: features: - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string splits: - name: train num_bytes: 35982323 num_examples: 20000 download_size: 21903259 dataset_size: 35982323 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "openai_summarize_generated_20k_relabelled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
filopedraz/swedish-sentiment-instruction-fine-tuning
--- dataset_info: features: - name: input dtype: string - name: output dtype: string - name: instruction dtype: string splits: - name: train num_bytes: 54726179 num_examples: 163841 download_size: 24121083 dataset_size: 54726179 --- # Dataset Card for "swedish-sentiment-instruction-fine-tuning" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ntnq/french_orca_dpo_pairs
--- license: apache-2.0 language: - fr size_categories: - 10K<n<100K tags: - rlhf - dpo --- This dataset offers a French translation of the 12k DPO pairs in [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs), which were made from [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca).
open-source-metrics/huggingface_hub-dependents
--- dataset_info: features: - name: name dtype: 'null' - name: stars dtype: 'null' - name: forks dtype: 'null' splits: - name: package - name: repository download_size: 1798 dataset_size: 0 --- # Dataset Card for "huggingface_hub-dependents" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
adrienheymans/autotrain-data-csi5386
--- language: - en --- # AutoTrain Dataset for project: csi5386 ## Dataset Description This dataset has been automatically processed by AutoTrain for project csi5386. ### Languages The BCP-47 code for the dataset's language is en. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "context": "Exhibit 10.1\n\nFORM OF SUB-RESELLER AGREEMENT\n\nSignature Page\n\nReseller Full Legal Name Salesforce.org, a nonprofit public benefit corporation having its principal place of business at 50 Fremont Street, Suite 300, San Francisco, California 94105\n\nThis Form of Sub-Reseller Agreement (this \"Sub-Reseller Agreement\") is made and entered in by and between salesforce.com, inc., a Delaware corporation having its principal place of business at The Landmark @ One Market, Suite 300, San Francisco, California 94105 (\"SFDC\" or \"Salesforce\") and the Reseller named above and amends that certain Reseller Agreement between Salesforce and Reseller dated as of August 1, 2015, as previously amended (the \"Agreement\"). This Sub-Reseller Agreement is effective as of the later of the dates beneath the Parties' signatures below (\"Sub-Reseller Effective Date\"), provided, however, that the dates of the Parties' signatures are not separated by a period of time greater than ten (10) business days. If such period is greater than ten (10) business days then this Sub-Reseller Agreement shall be deemed null and void and to be of no effect. Capitalized terms not defined herein shall have the meanings given to them in the Agreement.\n\nThe Parties, by their respective authorized signatories, have duly executed this Sub-Reseller Agreement as of the Sub-Reseller Effective Date.\n\nSalesforce.com, Inc. Reseller\n\nBy: By: Name: Name: Title: Title: Date: Date:\n\nSource: SALESFORCE.COM, INC., 10-Q, 11/22/2017\n\n\n\n\n\nExhibit 10.1\n\nSub-Reseller Agreement Terms & Conditions\n\n1. Resale Rights. 
SFDC hereby appoints SUB-RESELLER (\"Sub-Reseller\") as a sub-reseller to whom Reseller may resell Services in accordance with Section 2(ii) of the Agreement, provided that Sub-Reseller may only resell such Services to Customer. Reseller must ensure that Sub-Reseller complies with the terms of the Agreement applicable to Reseller as if Sub- Reseller were an original party to the Agreement and any breach by Sub-Reseller of the Agreement will be deemed a breach by Reseller. Sub-Reseller is not be a third-party beneficiary of the Agreement.\n\n2. Effect of Sub-Reseller Agreement. Subject to the above modifications, the Agreement remains in full force and effect.\n\n3. Entire Agreement. The terms and conditions herein contained constitute the entire agreement between the Parties with respect to the subject matter of this Sub-Reseller Agreement and supersede any previous and contemporaneous agreements and understandings, whether oral or written, between the Parties hereto with respect to the subject matter hereof.\n\n4. Counterparts. This Sub-Reseller Agreement may be executed in one or more counterparts, including facsimiles or scanned copies sent via email or otherwise, each of which will be deemed to be a duplicate original, but all of which, taken together, will be deemed to constitute a single instrument.\n\nSource: SALESFORCE.COM, INC., 10-Q, 11/22/2017", "question": "Highlight the parts (if any) of this contract related to \"Non-Disparagement\" that should be reviewed by a lawyer. 
Details: Is there a requirement on a party not to disparage the counterparty?", "answers.text": [ "" ], "answers.answer_start": [ -1 ], "feat_id": [ "SalesforcecomInc_20171122_10-Q_EX-10.1_10961535_EX-10.1_Reseller Agreement__Non-Disparagement_0" ], "feat_title": [ "SalesforcecomInc_20171122_10-Q_EX-10.1_10961535_EX-10.1_Reseller Agreement" ] }, { "context": "EXHIBIT 10.2\n\n DISTRIBUTOR AGREEMENT\n\nEXHIBIT 10.2\n\n EXCLUSIVE DISTRIBUTOR AGREEMENT\n\n THIS EXCLUSIVE DISTRIBUTOR AGREEMENT (the \"Agreement\") shall be effective as of _Dec. 8, 2005 (hereinafter \"Effective Date\"), by and between LifeUSA/ Envision Health, Inc., a corporation (hereinafter collectively \"ENVISION\"), and Sierra Mountain Minerals, Inc., a Canadian company (hereinafter \"SIERRA\"), is made with reference to the following facts:\n\n Recitals\n\nA. SIERRA is the manufacture and producer of a joint health product called \"SierraSil\" (hereinafter \"the Product\") for human use.\n\nB. ENVISION is the manufacturer of certain nutritional supplements and is desirous of becoming an exclusive distributor for the Product in any blend with Krill Oil (hereinafter \"the Finished Product\") in all distribution channels in the Territory on the terms and conditions set forth herein.\n\nC. SIERRA is desirous of having ENVISION act as its exclusive distributor for the Product in any blend with Krill Oil in all distribution channels in the Territory on the terms and conditions set forth herein.\n\nNOW, THEREFORE, it is hereby agreed as follows:\n\n1. Incorporation of Recitals. The Recitals set forth in Paragraphs A through C, above, are incorporated herein as though set forth in full.\n\n2. Appointment. SIERRA hereby appoints ENVISION as its exclusive distributor for the Product in any blend with Krill Oil within the Territory subject to ENVISION fulfilling the terms and conditions of the best efforts marketing requirements set forth herein in Sections 4, 5, and 9. 
SIERRA shall cease making sales to any customer or distributor who, during the term of this Agreement, violates ENVISION's exclusivity.\n\n3. Territory. The Territory shall be the entire world.\n\n4. Prices and Terms. The price for the Product as set forth in Section 9 herein, sold by SIERRA to ENVISION, shall be subject to change due to changes in manufacturing costs and so as to maximize profits; any changes in price for the Product shall not be applicable to previously accepted orders and shall be made with at least ninety (90) days advance notice in writing and in good faith by conference of the parties. ENVISION shall not resell the Product alone. Terms of payment will be 1/3 upon placement of order and 2/3 balance net thirty (30) days or as mutually agreed upon in writing between the parties. Delivery will be F.O.B. ENVISION shall be responsible for all costs of shipping from SIERRA to ENVISION.\n\n5. Product Support. ENVISION will use its best efforts to market and sell the Finished Product throughout the Territory. The parties also agree that:\n\n o If SIERRA customers are interested in purchasing the Product in any blend with Krill Oil, SIERRA will refer them to ENVISION.\n\n o ENVISION will be responsible for all costs associated with developing and manufacturing the Finished Product.\n\n6. Sales Disclosures. ENVISION will provide SIERRA with demand projections for the Product and SIERRA will produce enough Product to meet such demand projections. ENVISION will inform SIERRA of committed sales and SIERRA will increase or scale up its production of the Product accordingly. SIERRA will not unreasonably withhold the Product, but shall not be liable for unfulfilled or partially fulfilled orders given just cause for such action.\n\n7. Term. 
The term of this Agreement shall be two (2) years from the Effective Date with automatic annual renewals thereafter provided either party does not provide sixty (60) days notice of termination prior to the renewal date or the Agreement is not otherwise terminated as set forth in Section 8.\n\n8. Termination. (a) Upon the occurrence of a material breach or default as to any obligation, term or provision contained herein by either party and the failure of the breaching party to promptly pursue (within thirty (30) days after receiving written notice thereof from the non-breaching party) a reasonable remedy designed to cure (in the reasonable judgment of the non-breaching party) such material breach or default, this Agreement may be terminated by the non-breaching party by giving written notice of termination to the breaching party, such termination\n\n\n\n\n\n being immediately effective upon the giving of such notice of termination.\n\n (b) Upon the occurrence of bankruptcy of the other party, breach of confidentiality, government legislative interference, or force majeure extending beyond sixty (60) days, either party may immediately terminate the Agreement.\n\n9. Purchase Requirements. During the term of this Agreement, ENVISION will exclusively purchase the Product from SIERRA. The parties mutually agree to the Purchase Price of:\n\n Product Purchase Price ----------------------------------------------- A. SierraSil Per Sierra Sil's wholesale price list.\n\n10. Intellectual Property. SIERRA is responsible for all Patent costs for the Product. SIERRA warrants it owns pending patents for the Product in the U.S. and internationally. SIERRA hereby grants ENVISION an exclusive, royalty-free sub-license of the Product's future patents, and patent applications to distribute, sell and market the Finished Product. SIERRA hereby agrees to indemnify, defend and hold ENVISION harmless from any claims that the Product infringes upon any other patent.\n\n11. 
Trademarks SIERRA is the owner of the trademark&sbsp; \"SierraSil\". This Agreement grants ENVISION a non-exclusive and non-royalty bearing license to use the mark \"SierraSil\". SIERRA shall at all times be the owner of the trademark and ENVISION shall acquire no rights thereto. Upon termination, ENVISION shall have eighteen (18) months to exhaust any inventories, packaging and advertising materials bearing the \"SierraSil\" trademark and SIERRA shall have first option to buy back any inventory at ENVISION's net purchase price.\n\n12. Independent Contractor Status. The parties acknowledge that ENVISION is an independent contractor and shall not be deemed to be an employee, agent, or joint venturer of SIERRA for any purpose, including federal tax purposes.\n\n13. Warranty. SIERRA warrants that the Product shall be free from defects in material and workmanship for the reasonable shelf life of the Product. In the event of any breach of this warranty or in the event any user of Product makes a claim that the Product was the cause of personal injury or property damage (product liability claim), SIERRA shall indemnify, defend and hold ENVISION harmless from any liability occasioned by a breach of warranty or a product liability claim. SIERRA warrants that it carries general liability insurance of not less than $2 million per occurrence and product liability insurance of not less than $5 million per occurrence and that, upon the execution of this Agreement, it will name ENVISION as an additional insured on such policies. SIERRA further warrants that the Product will not be adulterated or misbranded within the meaning of any federal, state, or local law or regulation or other applicable law. 
SIERRA agrees to promptly notify ENVISION of any problem, anomaly, defect or condition which would reasonably cause ENVISION's concern relative to stability, reliability, form, fit, function or quality of the Product.\n\n ENVISION warrants that the Finished Product will not be adulterated or misbranded within the meaning of any federal, state, or local law or regulation or other applicable law. In the event of any breach of this warranty or in the event any user of the Finished Product makes a claim that the Finished Product was the cause of personal injury or property damage (product liability claim), ENVISION shall indemnify, defend, and hold SIERRA harmless from any liability occasioned by a breach of warranty or a product liability claim. ENVISION warrants that it carries general liability insurance of $1 million per occurrence and product liability insurance of not less than $2 million per occurrence and that, upon execution of this Agreement, it will name SIERRA as an additional insured on such policies.\n\n14. Confidential Information. The parties acknowledge that, during the term of this Agreement, each may receive certain Proprietary Information of the other. Proprietary Information includes, without limitation, formula, scientific studies, processes, plans, formulations, technical information, new product information, methods of product delivery, test procedures, product samples, specifications, scientific, clinical, commercial and other information or data, customer lists, customer contacts, and other distributors within the Territory which are considered confidential in nature whether communicated in writing or orally. The parties agree that each will treat such information as confidential. Neither party shall have the right to disclose the Proprietary Information to any third party without the express written consent of the disclosing party. 
Neither party may use the proprietary information except in furtherance of the goals of this Agreement and is further prohibited from utilizing the Proprietary Information directly nor indirectly to engage in any business activity which is competitive with the other.\n\n15. Force Majeure. In no event shall any party be responsible for its failure to fulfill any of its obligations under this Agreement when such failure is due to fires, floods, riots, strikes, freight embargoes, acts of God or insurrection. In the event of a force majeure, the party affected thereby shall give immediate written notice to the other. If the event of force majeure continues for longer than\n\n\n\n\n\n sixty (60) days, the party not so affected shall have the right to terminate this Agreement.\n\n16. Non-Waiver of Default. The failure of either party at any time to require the performance by a party of any provision of this Agreement shall in no way affect the right to require performance at any time after such failure. The waiver of either party of a breach of any provision of this Agreement shall not be taken to be a waiver of any succeeding breach of the provision or as a waiver of the provision itself.\n\n17. Attorney's Fees. In the event either party is required to institute litigation to enforce any provision of this Agreement, the prevailing party in such litigation shall be entitled to recover all costs including without limitation, reasonable attorney's fees and expenses incurred in connection with such enforcement and collection.\n\n18. Venue. This Agreement is deemed to have been entered into in the State of Colorado, and its interpretation, construction, and the remedies for its enforcement or breach are to be applied pursuant to and in accordance with the laws of the State of Colorado.\n\n19. Notices. 
Any and all notices or other communication required or permitted to be given pursuant to this Agreement shall be in writing and shall be construed as properly given if mailed first class, postage prepaid to the address specified herein. Either party may designate, in writing, a change of address or other place to which notices may be sent.\n\n If to SIERRA: If to LIFEUSA/ENVISION: Mr. Michael Bentley Mr. Michael Schuett Sierra Mountain Minerals Inc. Envision Health, Inc. 1501 West Broadway, Suite 500 2475 Broadway, Suite 202 Vancouver BC V6J4Z6 Boulder, CO 80304 Canada\n\n20. Amendment. This Agreement shall not be modified or amended except by a written agreement executed by both parties.\n\n21. Entire Agreement. This Agreement constitutes the entire agreement between the parties with respect to the subject matter thereof and supersedes all prior agreements, whether written or oral.\n\n22. Assignment. The parties shall have the right to assign all, or part, of its rights under this Agreement to any wholly owned subsidiary or affiliate without the consent of the other Party. Any other assignment by the parties, requires the prior written consent of the other Party.\n\nACKNOWLEDGEMENTS\n\n Each party acknowledges that he or she has had an adequate opportunity to read and study this Agreement. 
The understanding of the aforesaid articles causes no difficulty whatsoever and each party has retained a copy of this agreement immediately after the signing of it by all parties.\n\n IN WITNESS WHEREOF, the parties have executed this Agreement effective as of the date and year first written above.\n\nSIERRA MOUNTAIN MINERALS LIFEUSA/ENVISION HEALTH\n\nBy: /s/ Michael Bentley By: /s/ Michael Schuett ----------------------- ------------------------- Michael Bentley Michael Schuett\n\n December 8, 2005 December 7, 2005 ----------------------- ------------------------------ Date Date", "question": "Highlight the parts (if any) of this contract related to \"Third Party Beneficiary\" that should be reviewed by a lawyer. Details: Is there a non-contracting party who is a beneficiary to some or all of the clauses in the contract and therefore can enforce its rights against a contracting party?", "answers.text": [ "" ], "answers.answer_start": [ -1 ], "feat_id": [ "LEGACYTECHNOLOGYHOLDINGS,INC_12_09_2005-EX-10.2-DISTRIBUTOR AGREEMENT__Third Party Beneficiary_0" ], "feat_title": [ "LEGACYTECHNOLOGYHOLDINGS,INC_12_09_2005-EX-10.2-DISTRIBUTOR AGREEMENT" ] } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "context": "Value(dtype='string', id=None)", "question": "Value(dtype='string', id=None)", "answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)", "answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)", "feat_id": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)", "feat_title": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follow: | Split name | Num samples | | ------------ | ------------------- | | train | 16687 | | valid | 4182 |
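Judging from the two samples above, unanswerable questions appear to carry `answers.answer_start == [-1]`. A small sketch of filtering on that convention (an inference from the samples, not documented behaviour):

```python
# Sketch: distinguishing answerable from unanswerable contract-review
# questions. The "-1 means no relevant clause" convention is inferred from
# the two samples in this card, not documented by AutoTrain.

def is_answerable(example):
    """True if at least one answer span has a real character offset."""
    return any(start >= 0 for start in example["answers.answer_start"])

# Toy examples mirroring the documented fields (offsets are illustrative).
unanswerable = {
    "question": "Highlight the parts related to 'Non-Disparagement' ...",
    "answers.text": [""],
    "answers.answer_start": [-1],
}
answerable = {
    "question": "Highlight the parts related to 'Parties' ...",
    "answers.text": ["Salesforce.org"],
    "answers.answer_start": [74],
}

print(is_answerable(unanswerable), is_answerable(answerable))  # False True
```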
vphu123/llm_data_2
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 21132 num_examples: 26 download_size: 14171 dataset_size: 21132 --- # Dataset Card for "llm_data_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
keremberke/construction-safety-object-detection
--- task_categories: - object-detection tags: - roboflow - roboflow2huggingface - Construction - Logistics - Utilities - Damage Risk - Ppe - Construction - Utilities - Manufacturing - Logistics - Ppe - Assembly Line - Warehouse - Factory --- <div align="center"> <img width="640" alt="keremberke/construction-safety-object-detection" src="https://huggingface.co/datasets/keremberke/construction-safety-object-detection/resolve/main/thumbnail.jpg"> </div> ### Dataset Labels ``` ['barricade', 'dumpster', 'excavators', 'gloves', 'hardhat', 'mask', 'no-hardhat', 'no-mask', 'no-safety vest', 'person', 'safety net', 'safety shoes', 'safety vest', 'dump truck', 'mini-van', 'truck', 'wheel loader'] ``` ### Number of Images ```json {'train': 307, 'valid': 57, 'test': 34} ``` ### How to Use - Install [datasets](https://pypi.org/project/datasets/): ```bash pip install datasets ``` - Load the dataset: ```python from datasets import load_dataset ds = load_dataset("keremberke/construction-safety-object-detection", name="full") example = ds['train'][0] ``` ### Roboflow Dataset Page [https://universe.roboflow.com/roboflow-universe-projects/construction-site-safety/dataset/1](https://universe.roboflow.com/roboflow-universe-projects/construction-site-safety/dataset/1?ref=roboflow2huggingface) ### Citation ``` @misc{ construction-site-safety_dataset, title = { Construction Site Safety Dataset }, type = { Open Source Dataset }, author = { Roboflow Universe Projects }, howpublished = { \\url{ https://universe.roboflow.com/roboflow-universe-projects/construction-site-safety } }, url = { https://universe.roboflow.com/roboflow-universe-projects/construction-site-safety }, journal = { Roboflow Universe }, publisher = { Roboflow }, year = { 2023 }, month = { jan }, note = { visited on 2023-01-26 }, } ``` ### License CC BY 4.0 ### Dataset Summary This dataset was exported via roboflow.com on December 29, 2022 at 11:22 AM GMT Roboflow is an end-to-end computer vision platform that helps you * 
collaborate with your team on computer vision projects * collect & organize images * understand unstructured image data * annotate, and create datasets * export, train, and deploy computer vision models * use active learning to improve your dataset over time It includes 398 images. Construction are annotated in COCO format. The following pre-processing was applied to each image: * Auto-orientation of pixel data (with EXIF-orientation stripping) No image augmentation techniques were applied.
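A sketch of mapping integer class ids back to the 17 label names listed above. Note the `objects`/`category` annotation layout is an assumption about this COCO-style export, not something the card documents; verify against a loaded example:

```python
# Sketch: class-id -> label-name lookup using the Dataset Labels list above.
# The per-image annotation layout ("objects" holding a "category" id list)
# is an assumption about this COCO-format export.

LABELS = ['barricade', 'dumpster', 'excavators', 'gloves', 'hardhat', 'mask',
          'no-hardhat', 'no-mask', 'no-safety vest', 'person', 'safety net',
          'safety shoes', 'safety vest', 'dump truck', 'mini-van', 'truck',
          'wheel loader']

def detected_names(example):
    """Translate an example's category ids into human-readable label names."""
    return [LABELS[c] for c in example["objects"]["category"]]

# Toy example; real ones would come from the load_dataset call shown earlier.
toy = {"objects": {"category": [9, 4, 12]}}
print(detected_names(toy))  # ['person', 'hardhat', 'safety vest']
```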
hugfaceguy0001/ClassicNovels
--- dataset_info: - config_name: info features: - name: name dtype: string - name: catalogues sequence: string - name: catalogueTotal dtype: int64 - name: bookType dtype: string - name: words dtype: int64 - name: author struct: - name: dynasty dtype: string - name: intro dtype: string - name: name dtype: string - name: intro dtype: string splits: - name: train num_bytes: 1568384 num_examples: 441 download_size: 1041411 dataset_size: 1568384 - config_name: text features: - name: book_name dtype: string - name: author dtype: string - name: category dtype: string - name: chapter_id dtype: int64 - name: chapter_name dtype: string - name: N_chapters dtype: int64 - name: text dtype: string splits: - name: train num_bytes: 259338573 num_examples: 20617 download_size: 178111852 dataset_size: 259338573 configs: - config_name: info data_files: - split: train path: info/train-* - config_name: text data_files: - split: train path: text/train-* ---
davanstrien/test_imdb_embedd
--- annotations_creators: - expert-generated language_creators: - expert-generated language: - en license: - other multilinguality: - monolingual size_categories: - 10K<n<100K source_datasets: imdb task_categories: - text-classification task_ids: - sentiment-classification paperswithcode_id: imdb-movie-reviews pretty_name: IMDB dataset_info: features: - name: text dtype: string - name: label dtype: class_label: names: 0: neg 1: pos config_name: plain_text splits: - name: train num_bytes: 33432835 num_examples: 25000 - name: test num_bytes: 32650697 num_examples: 25000 - name: unsupervised num_bytes: 67106814 num_examples: 50000 download_size: 84125825 dataset_size: 133190346 tags: - embeddings train-eval-index: - config: plain_text task: text-classification task_id: binary_classification splits: train_split: train eval_split: test col_mapping: text: text label: target metrics: - type: accuracy - name: Accuracy - type: f1 name: F1 macro args: average: macro - type: f1 name: F1 micro args: average: micro - type: f1 name: F1 weighted args: average: weighted - type: precision name: Precision macro args: average: macro - type: precision name: Precision micro args: average: micro - type: precision name: Precision weighted args: average: weighted - type: recall name: Recall macro args: average: macro - type: recall name: Recall micro args: average: micro - type: recall name: Recall weighted args: average: weighted --- # Dataset Card for "test_imdb_embedd" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
aseel-kh/aratts
--- license: mpl-2.0 ---
todi1/pasmr
--- license: openrail ---
someguitar/MGdata
--- license: mit ---
voice-is-cool/voxtube
--- dataset_info: homepage: https://idrnd.github.io/VoxTube/ description: VoxTube - a multilingual speaker recognition dataset license: CC-BY-NC-SA-4.0 citation: "@inproceedings{yakovlev23_interspeech, author={Ivan Yakovlev and Anton Okhotnikov and Nikita Torgashov and Rostislav Makarov and Yuri Voevodin and Konstantin Simonchik}, title={{VoxTube: a multilingual speaker recognition dataset}}, year=2023, booktitle={Proc. INTERSPEECH 2023}, pages={2238--2242}, doi={10.21437/Interspeech.2023-1083} }" features: - name: upload_date dtype: date32 - name: segment_id dtype: int32 - name: video_id dtype: string - name: channel_id dtype: string - name: language dtype: string - name: gender dtype: string - name: spk_id dtype: int32 - name: spk_estim_age dtype: float32 - name: spk_estim_age_mae dtype: float32 - name: audio dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 222149986832.446 num_examples: 4459754 download_size: 220167447157 dataset_size: 222149986832.446 configs: - config_name: default data_files: - split: train path: data/train-* license: cc-by-nc-sa-4.0 task_categories: - audio-classification language: - en - ru - es - pt - fr - ar - it - de - tr - nl - ko pretty_name: VoxTube size_categories: - 1M<n<10M extra_gated_fields: Name: text Affiliation: text Email: text I understand the applicability and accept the limitations of CC-BY-NC-SA license of this dataset that NO commercial usage is allowed: checkbox By clicking on "Access repository" below, I agree to not attempt to determine the identity of speakers in the dataset: checkbox --- # The VoxTube Dataset The [VoxTube](https://idrnd.github.io/VoxTube) is a multilingual speaker recognition dataset collected from the **CC BY 4.0** YouTube videos. It includes 5.040 speaker identities pronouncing ~4M utterances in 10+ languages. For the underlying data collection and filtering approach details please refer to [[1]](#citation). 
## Dataset Structure

### Data Instances

A typical data point comprises the audio signal itself, along with additional labels such as speaker ID, session ID (*video_id*), language, and gender.

```
{'upload_date': datetime.date(2018, 5, 2),
 'segment_id': 11,
 'video_id': 'vIpK78CL1so',
 'channel_id': 'UC7rMVNUr7318I0MKumPbIKA',
 'language': 'english',
 'gender': 'male',
 'spk_id': 684,
 'spk_estim_age': 23.5572452545166,
 'spk_estim_age_mae': 3.6162896156311035,
 'audio': {'path': 'UC7rMVNUr7318I0MKumPbIKA/vIpK78CL1so/segment_11.mp3',
  'array': array([-0.00986903, -0.01569703, -0.02005875, ..., -0.00247505,
         -0.01329966, -0.01462782]),
  'sampling_rate': 16000}}
```

### Data Fields

- **channel_id**: YouTube channel ID, from which the speaker ID (`spk_id`) is derived.
- **video_id**: YouTube video ID, i.e. a session for the speaker.
- **segment_id**: ID of the audio chunk of the video that passed the filtering process.
- **upload_date**: Date object representing when the video was uploaded to YouTube.
- **language**: Language of the channel / speaker.
- **gender**: Gender of the channel / speaker.
- **spk_id**: Integer speaker ID inferred from **channel_id**.
- **spk_estim_age**: Estimated (approximate) speaker age, produced by voice-based automatic age estimation and calibrated using the upload dates of all videos for a given channel.
- **spk_estim_age_mae**: MAE of **spk_estim_age** (can be treated as a confidence measure).
- **audio**: audio signal of a 4-second *mp3* segment from **channel_id/video_id**.

## Dataset description

### Main statistics

| Dataset properties           | Stats     |
|:-----------------------------|:----------|
| # of POI                     | 5.040     |
| # of videos                  | 306.248   |
| # of segments                | 4.439.888 |
| # of hours                   | 4.933     |
| Avg # of videos per POI      | 61        |
| Avg # of segments per POI    | 881       |
| Avg length of segments (sec) | 4         |

### Language and gender distributions

![Distributions](./lang_gender.jpeg)

Language and gender labels for each speaker are available in the original repo [here](https://github.com/IDRnD/VoxTube/blob/main/resources/language_gender_meta.csv).

## License

The dataset is licensed under **CC BY-NC-SA 4.0**; please see the complete version of the [license](LICENSE).

Please also note that the provided metadata is valid as of February 2023, and the corresponding CC BY 4.0 video licenses were valid on that date. ID R&D Inc. is not responsible if a video's license type was changed or the video was deleted from the YouTube platform. If you want your channel's metadata to be deleted from the dataset, please [contact ID R&D Inc.](https://www.idrnd.ai/contact-us) with the topic *"VoxTube change request"*.

## Development

See the official [live repository](https://github.com/IDRnD/VoxTube) for opening issues.

## Citation

Please cite the paper below if you make use of the dataset:

```
@inproceedings{yakovlev23_interspeech,
  author={Ivan Yakovlev and Anton Okhotnikov and Nikita Torgashov and Rostislav Makarov and Yuri Voevodin and Konstantin Simonchik},
  title={{VoxTube: a multilingual speaker recognition dataset}},
  year=2023,
  booktitle={Proc. INTERSPEECH 2023},
  pages={2238--2242},
  doi={10.21437/Interspeech.2023-1083}
}
```
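As a sanity check, the per-POI averages and total hours in the statistics table follow directly from the raw counts, and the `audio.path` layout shown in the Data Instances example can be reconstructed from the metadata fields. A quick sketch (pure Python, no download needed; the path layout is taken from the example above, not from an official API):

```python
def segment_path(channel_id: str, video_id: str, segment_id: int) -> str:
    """Rebuild the relative mp3 path used in the `audio` field,
    mirroring the layout shown in the Data Instances example."""
    return f"{channel_id}/{video_id}/segment_{segment_id}.mp3"

print(segment_path("UC7rMVNUr7318I0MKumPbIKA", "vIpK78CL1so", 11))
# -> UC7rMVNUr7318I0MKumPbIKA/vIpK78CL1so/segment_11.mp3

# The averages in the statistics table follow from the raw counts:
n_poi, n_videos, n_segments, seg_len_sec = 5040, 306248, 4439888, 4

print(round(n_videos / n_poi))                 # -> 61   videos per POI
print(round(n_segments / n_poi))               # -> 881  segments per POI
print(round(n_segments * seg_len_sec / 3600))  # -> 4933 hours in total
```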
TheGreatP/vozSisso
--- license: openrail ---
Quangnguyen711/clothes_shop_consultant
--- license: apache-2.0 task_categories: - question-answering language: - en tags: - finance size_categories: - n<1K ---
EnergyStarAI/text_generation
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1707427 num_examples: 1000 download_size: 1073650 dataset_size: 1707427 configs: - config_name: default data_files: - split: train path: data/train-* ---
suolyer/pile_books3
--- license: apache-2.0 ---
open-llm-leaderboard/details_google__gemma-1.1-7b-it
--- pretty_name: Evaluation run of google/gemma-1.1-7b-it dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [google/gemma-1.1-7b-it](https://huggingface.co/google/gemma-1.1-7b-it) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_google__gemma-1.1-7b-it\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-15T12:58:48.766553](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-1.1-7b-it/blob/main/results_2024-04-15T12-58-48.766553.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.606678798665646,\n\ \ \"acc_stderr\": 0.03317128732740283,\n \"acc_norm\": 0.6116007890085364,\n\ \ \"acc_norm_stderr\": 0.03383193418218571,\n \"mc1\": 0.3427172582619339,\n\ \ \"mc1_stderr\": 0.01661494938534704,\n \"mc2\": 0.5074425172130677,\n\ \ \"mc2_stderr\": 0.0164219457298532\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\ \ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946705\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.58743278231428,\n \ \ \"acc_stderr\": 0.004912900450370839,\n \"acc_norm\": 0.7614021111332404,\n\ \ \"acc_norm_stderr\": 0.0042535530447077715\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\ \ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\ \ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\ \ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\ \ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\ \ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\ \ \"acc_norm_stderr\": 0.038990736873573344\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\ \ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\ \ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\ \ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\ \ \"acc_stderr\": 0.04657047260594961,\n \"acc_norm\": 0.4298245614035088,\n\ \ \"acc_norm_stderr\": 0.04657047260594961\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\ \ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493838,\n \"\ acc_norm\": 0.4656084656084656,\n 
\"acc_norm_stderr\": 0.025690321762493838\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"\ acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5517241379310345,\n \"acc_stderr\": 0.034991131376767445,\n \"\ acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.034991131376767445\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\ : 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\ \ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932046,\n \"\ acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932046\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\ \ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n\ \ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.4185185185185185,\n \"acc_stderr\": 0.03007801307502206,\n \ \ \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.03007801307502206\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297792,\n \ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297792\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\ acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\ acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\ acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \ \ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\ \ \"acc_stderr\": 0.030500283176545854,\n \"acc_norm\": 0.7085201793721974,\n\ \ \"acc_norm_stderr\": 0.030500283176545854\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\ \ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\ acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\ \ \"acc_stderr\": 0.02363687331748929,\n \"acc_norm\": 0.8461538461538461,\n\ \ \"acc_norm_stderr\": 0.02363687331748929\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\ \ \"acc_stderr\": 0.015104550008905713,\n \"acc_norm\": 0.7675606641123882,\n\ \ \"acc_norm_stderr\": 0.015104550008905713\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584204,\n\ \ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584204\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23128491620111732,\n\ \ \"acc_stderr\": 0.014102223623152573,\n \"acc_norm\": 0.23128491620111732,\n\ 
\ \"acc_norm_stderr\": 0.014102223623152573\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n\ \ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\ \ \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n\ \ \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.02677492989972234,\n\ \ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.02677492989972234\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \ \ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\ \ \"acc_stderr\": 0.01268397251359881,\n \"acc_norm\": 0.44198174706649285,\n\ \ \"acc_norm_stderr\": 0.01268397251359881\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5996732026143791,\n \"acc_stderr\": 0.01982184368827176,\n \ \ \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.01982184368827176\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\ \ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\ \ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\ \ },\n \"harness|hendrycksTest-sociology|5\": 
{\n \"acc\": 0.8159203980099502,\n\ \ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\ \ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\ \ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.5060240963855421,\n\ \ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\ \ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n\ \ \"mc1_stderr\": 0.01661494938534704,\n \"mc2\": 0.5074425172130677,\n\ \ \"mc2_stderr\": 0.0164219457298532\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.01291672746263446\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42987111448066717,\n \ \ \"acc_stderr\": 0.013636344017393736\n }\n}\n```" repo_url: https://huggingface.co/google/gemma-1.1-7b-it leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|arc:challenge|25_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-15T12-58-48.766553.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|gsm8k|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_15T12_58_48.766553 path: - 
'**/details_harness|hellaswag|10_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-58-48.766553.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-58-48.766553.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-58-48.766553.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-58-48.766553.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-58-48.766553.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-58-48.766553.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-58-48.766553.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-58-48.766553.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T12-58-48.766553.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_15T12_58_48.766553 path: - '**/details_harness|winogrande|5_2024-04-15T12-58-48.766553.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-15T12-58-48.766553.parquet' - config_name: results data_files: - split: 
2024_04_15T12_58_48.766553 path: - results_2024-04-15T12-58-48.766553.parquet - split: latest path: - results_2024-04-15T12-58-48.766553.parquet ---

# Dataset Card for Evaluation run of google/gemma-1.1-7b-it

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [google/gemma-1.1-7b-it](https://huggingface.co/google/gemma-1.1-7b-it) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_google__gemma-1.1-7b-it",
    "harness_winogrande_5",
    split="train",
)
```

## Latest results

These are the [latest results from run 2024-04-15T12:58:48.766553](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-1.1-7b-it/blob/main/results_2024-04-15T12-58-48.766553.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.606678798665646, "acc_stderr": 0.03317128732740283, "acc_norm": 0.6116007890085364, "acc_norm_stderr": 0.03383193418218571, "mc1": 0.3427172582619339, "mc1_stderr": 0.01661494938534704, "mc2": 0.5074425172130677, "mc2_stderr": 0.0164219457298532 }, "harness|arc:challenge|25": { "acc": 0.5708191126279863, "acc_stderr": 0.014464085894870653, "acc_norm": 0.6006825938566553, "acc_norm_stderr": 0.014312094557946705 }, "harness|hellaswag|10": { "acc": 0.58743278231428, "acc_stderr": 0.004912900450370839, "acc_norm": 0.7614021111332404, "acc_norm_stderr": 0.0042535530447077715 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5185185185185185, "acc_stderr": 0.043163785995113245, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6377358490566037, "acc_stderr": 0.029582245128384303, "acc_norm": 0.6377358490566037, "acc_norm_stderr": 0.029582245128384303 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.037336266553835096, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.037336266553835096 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105654, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.04657047260594961, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.04657047260594961 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4656084656084656, "acc_stderr": 0.025690321762493838, "acc_norm": 0.4656084656084656, "acc_norm_stderr": 0.025690321762493838 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7451612903225806, "acc_stderr": 0.024790118459332208, "acc_norm": 0.7451612903225806, "acc_norm_stderr": 0.024790118459332208 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5517241379310345, "acc_stderr": 0.034991131376767445, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.034991131376767445 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.696969696969697, "acc_stderr": 0.03588624800091706, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03588624800091706 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.030532892233932046, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.030532892233932046 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.026148483469153314, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.026148483469153314 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6205128205128205, "acc_stderr": 0.024603626924097417, "acc_norm": 0.6205128205128205, "acc_norm_stderr": 0.024603626924097417 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4185185185185185, "acc_stderr": 0.03007801307502206, "acc_norm": 0.4185185185185185, "acc_norm_stderr": 0.03007801307502206 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.03006676158297792, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.03006676158297792 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658752, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8110091743119267, "acc_stderr": 0.016785481159203627, "acc_norm": 0.8110091743119267, "acc_norm_stderr": 0.016785481159203627 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 
0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591362, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591362 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.030500283176545854, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.030500283176545854 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6870229007633588, "acc_stderr": 0.04066962905677697, "acc_norm": 0.6870229007633588, "acc_norm_stderr": 0.04066962905677697 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.02363687331748929, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.02363687331748929 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 
0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7675606641123882, "acc_stderr": 0.015104550008905713, "acc_norm": 0.7675606641123882, "acc_norm_stderr": 0.015104550008905713 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6416184971098265, "acc_stderr": 0.025816756791584204, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.025816756791584204 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23128491620111732, "acc_stderr": 0.014102223623152573, "acc_norm": 0.23128491620111732, "acc_norm_stderr": 0.014102223623152573 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6830065359477124, "acc_stderr": 0.02664327847450875, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.02664327847450875 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6559485530546624, "acc_stderr": 0.026981478043648043, "acc_norm": 0.6559485530546624, "acc_norm_stderr": 0.026981478043648043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6358024691358025, "acc_stderr": 0.02677492989972234, "acc_norm": 0.6358024691358025, "acc_norm_stderr": 0.02677492989972234 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44198174706649285, "acc_stderr": 0.01268397251359881, "acc_norm": 0.44198174706649285, "acc_norm_stderr": 0.01268397251359881 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5996732026143791, "acc_stderr": 0.01982184368827176, "acc_norm": 0.5996732026143791, "acc_norm_stderr": 0.01982184368827176 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 0.04220224692971987, "acc_norm": 
0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.02853556033712844, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.02853556033712844 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786845, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786845 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333047, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333047 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.03126781714663179, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.03126781714663179 }, "harness|truthfulqa:mc|0": { "mc1": 0.3427172582619339, "mc1_stderr": 0.01661494938534704, "mc2": 0.5074425172130677, "mc2_stderr": 0.0164219457298532 }, "harness|winogrande|5": { "acc": 0.696921862667719, "acc_stderr": 0.01291672746263446 }, "harness|gsm8k|5": { "acc": 0.42987111448066717, "acc_stderr": 0.013636344017393736 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
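The per-task results above are plain JSON objects keyed by harness task names. Once loaded, they can be ranked or aggregated in a few lines. A minimal sketch in Python — the `results` literal here is a small illustrative subset of the scores shown above, not the full results file:

```python
# Illustrative subset of the harness scores above; the real file holds
# one such entry per task, plus stderr and acc_norm variants.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7894736842105263},
    "harness|gsm8k|5": {"acc": 0.42987111448066717},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.23128491620111732},
}

# Rank tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, scores in ranked:
    print(f"{task}: {scores['acc']:.4f}")
```

The same pattern extends to the full results dict, e.g. to average accuracies across all `hendrycksTest` subtasks.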
AiHevenpen/file
--- license: mit ---
malucoelhaofc/DoutorPimpolhoV2
--- license: openrail ---
sakib131/pastis-segmentation
--- dataset_info: features: - name: image dtype: image - name: label dtype: image splits: - name: train num_bytes: 10907117.747214139 num_examples: 2189 - name: test num_bytes: 604176.2683929305 num_examples: 122 - name: valid num_bytes: 601786.2683929305 num_examples: 122 download_size: 10294182 dataset_size: 12113080.283999998 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: valid path: data/valid-* ---
Azure99/blossom-orca-v1
--- license: apache-2.0 task_categories: - text-generation - text2text-generation language: - zh - en size_categories: - 100K<n<1M --- # BLOSSOM ORCA V1 ### Introduction [Blossom Orca V2](https://huggingface.co/datasets/Azure99/blossom-orca-v2) has been released! 🤗 Blossom Orca V1 is a bilingual Chinese-English instruction dataset derived from OpenOrca, suitable for instruction fine-tuning. System prompts and instructions were extracted from OpenOrca, translated into Chinese with the translations verified, and then sent to gpt-3.5-turbo-0613 to generate responses; responses containing self-identification or refusals were filtered out to facilitate subsequent alignment. In addition, to ensure a consistent response style and a balanced Chinese-English ratio, the same calls were made for the untranslated original instructions, yielding 1:1 bilingual instruction data. Compared with Chinese datasets produced by directly translating the original OpenOrca, Blossom Orca offers higher consistency and quality. This release covers 30% of the full data: 100K records each in Chinese and English, 200K in total. ### Languages Primarily Chinese and English. ### Dataset Structure The dataset contains two files, blossom-orca-v1-chinese-100k.json and blossom-orca-v1-english-100k.json, holding the Chinese and English data respectively. Each record represents a complete conversation and contains two fields: id and conversations. - id: string; the instruction id from the original OpenOrca. - conversations: array of objects, each with role and content fields; role is one of system, user, or assistant (the system prompt, user input, and assistant output respectively), and content is the corresponding text. ### Dataset Limitations All responses in this dataset were generated by gpt-3.5-turbo-0613 and have not undergone rigorous validation; they may contain inaccurate or even seriously wrong answers. Moreover, because refusal responses were filtered out, a model trained only on this dataset may not refuse illegal requests.
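The conversations schema described above maps directly onto chat-style fine-tuning formats. A minimal sketch of flattening one record into a training prompt — the record contents here are illustrative, not taken from the actual files:

```python
# Illustrative record in the card's schema; real ids come from OpenOrca.
record = {
    "id": "niv.242684",
    "conversations": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the article in one sentence."},
        {"role": "assistant", "content": "The article describes recent progress."},
    ],
}

def to_prompt(conversations):
    """Flatten system/user/assistant turns into a simple prompt string."""
    return "\n".join(f"{turn['role']}: {turn['content']}" for turn in conversations)

prompt = to_prompt(record["conversations"])
print(prompt)
```

In practice you would substitute the chat template of whichever model you are fine-tuning; the role names here match those stored in the JSON files.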
BangumiBase/datealive
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Date A Live This is the image base of the bangumi DATE A LIVE; we detected 92 characters and 9273 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may contain some noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is a preview of the characters: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 348 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 105 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 67 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 84 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | 
![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 42 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 2476 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 108 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 82 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 63 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 33 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 54 | 
[Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 789 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 40 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 132 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 19 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 12 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 25 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | 
![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 15 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 12 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 26 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 15 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 40 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 21 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 60 | 
[Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 60 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 32 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 65 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 27 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) | | 28 | 13 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 13 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | 
![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) | | 30 | 30 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) | | 31 | 16 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) | | 32 | 12 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) | | 33 | 35 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) | | 34 | 94 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | ![preview 8](34/preview_8.png) | | 35 | 207 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | ![preview 8](35/preview_8.png) | | 36 | 402 | 
[Download](36/dataset.zip) | ![preview 1](36/preview_1.png) | ![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | ![preview 7](36/preview_7.png) | ![preview 8](36/preview_8.png) | | 37 | 156 | [Download](37/dataset.zip) | ![preview 1](37/preview_1.png) | ![preview 2](37/preview_2.png) | ![preview 3](37/preview_3.png) | ![preview 4](37/preview_4.png) | ![preview 5](37/preview_5.png) | ![preview 6](37/preview_6.png) | ![preview 7](37/preview_7.png) | ![preview 8](37/preview_8.png) | | 38 | 64 | [Download](38/dataset.zip) | ![preview 1](38/preview_1.png) | ![preview 2](38/preview_2.png) | ![preview 3](38/preview_3.png) | ![preview 4](38/preview_4.png) | ![preview 5](38/preview_5.png) | ![preview 6](38/preview_6.png) | ![preview 7](38/preview_7.png) | ![preview 8](38/preview_8.png) | | 39 | 23 | [Download](39/dataset.zip) | ![preview 1](39/preview_1.png) | ![preview 2](39/preview_2.png) | ![preview 3](39/preview_3.png) | ![preview 4](39/preview_4.png) | ![preview 5](39/preview_5.png) | ![preview 6](39/preview_6.png) | ![preview 7](39/preview_7.png) | ![preview 8](39/preview_8.png) | | 40 | 34 | [Download](40/dataset.zip) | ![preview 1](40/preview_1.png) | ![preview 2](40/preview_2.png) | ![preview 3](40/preview_3.png) | ![preview 4](40/preview_4.png) | ![preview 5](40/preview_5.png) | ![preview 6](40/preview_6.png) | ![preview 7](40/preview_7.png) | ![preview 8](40/preview_8.png) | | 41 | 45 | [Download](41/dataset.zip) | ![preview 1](41/preview_1.png) | ![preview 2](41/preview_2.png) | ![preview 3](41/preview_3.png) | ![preview 4](41/preview_4.png) | ![preview 5](41/preview_5.png) | ![preview 6](41/preview_6.png) | ![preview 7](41/preview_7.png) | ![preview 8](41/preview_8.png) | | 42 | 19 | [Download](42/dataset.zip) | ![preview 1](42/preview_1.png) | ![preview 2](42/preview_2.png) | ![preview 3](42/preview_3.png) | ![preview 4](42/preview_4.png) | 
![preview 5](42/preview_5.png) | ![preview 6](42/preview_6.png) | ![preview 7](42/preview_7.png) | ![preview 8](42/preview_8.png) | | 43 | 26 | [Download](43/dataset.zip) | ![preview 1](43/preview_1.png) | ![preview 2](43/preview_2.png) | ![preview 3](43/preview_3.png) | ![preview 4](43/preview_4.png) | ![preview 5](43/preview_5.png) | ![preview 6](43/preview_6.png) | ![preview 7](43/preview_7.png) | ![preview 8](43/preview_8.png) | | 44 | 691 | [Download](44/dataset.zip) | ![preview 1](44/preview_1.png) | ![preview 2](44/preview_2.png) | ![preview 3](44/preview_3.png) | ![preview 4](44/preview_4.png) | ![preview 5](44/preview_5.png) | ![preview 6](44/preview_6.png) | ![preview 7](44/preview_7.png) | ![preview 8](44/preview_8.png) | | 45 | 43 | [Download](45/dataset.zip) | ![preview 1](45/preview_1.png) | ![preview 2](45/preview_2.png) | ![preview 3](45/preview_3.png) | ![preview 4](45/preview_4.png) | ![preview 5](45/preview_5.png) | ![preview 6](45/preview_6.png) | ![preview 7](45/preview_7.png) | ![preview 8](45/preview_8.png) | | 46 | 15 | [Download](46/dataset.zip) | ![preview 1](46/preview_1.png) | ![preview 2](46/preview_2.png) | ![preview 3](46/preview_3.png) | ![preview 4](46/preview_4.png) | ![preview 5](46/preview_5.png) | ![preview 6](46/preview_6.png) | ![preview 7](46/preview_7.png) | ![preview 8](46/preview_8.png) | | 47 | 373 | [Download](47/dataset.zip) | ![preview 1](47/preview_1.png) | ![preview 2](47/preview_2.png) | ![preview 3](47/preview_3.png) | ![preview 4](47/preview_4.png) | ![preview 5](47/preview_5.png) | ![preview 6](47/preview_6.png) | ![preview 7](47/preview_7.png) | ![preview 8](47/preview_8.png) | | 48 | 16 | [Download](48/dataset.zip) | ![preview 1](48/preview_1.png) | ![preview 2](48/preview_2.png) | ![preview 3](48/preview_3.png) | ![preview 4](48/preview_4.png) | ![preview 5](48/preview_5.png) | ![preview 6](48/preview_6.png) | ![preview 7](48/preview_7.png) | ![preview 8](48/preview_8.png) | | 49 | 48 | 
[Download](49/dataset.zip) | ![preview 1](49/preview_1.png) | ![preview 2](49/preview_2.png) | ![preview 3](49/preview_3.png) | ![preview 4](49/preview_4.png) | ![preview 5](49/preview_5.png) | ![preview 6](49/preview_6.png) | ![preview 7](49/preview_7.png) | ![preview 8](49/preview_8.png) | | 50 | 76 | [Download](50/dataset.zip) | ![preview 1](50/preview_1.png) | ![preview 2](50/preview_2.png) | ![preview 3](50/preview_3.png) | ![preview 4](50/preview_4.png) | ![preview 5](50/preview_5.png) | ![preview 6](50/preview_6.png) | ![preview 7](50/preview_7.png) | ![preview 8](50/preview_8.png) | | 51 | 22 | [Download](51/dataset.zip) | ![preview 1](51/preview_1.png) | ![preview 2](51/preview_2.png) | ![preview 3](51/preview_3.png) | ![preview 4](51/preview_4.png) | ![preview 5](51/preview_5.png) | ![preview 6](51/preview_6.png) | ![preview 7](51/preview_7.png) | ![preview 8](51/preview_8.png) | | 52 | 124 | [Download](52/dataset.zip) | ![preview 1](52/preview_1.png) | ![preview 2](52/preview_2.png) | ![preview 3](52/preview_3.png) | ![preview 4](52/preview_4.png) | ![preview 5](52/preview_5.png) | ![preview 6](52/preview_6.png) | ![preview 7](52/preview_7.png) | ![preview 8](52/preview_8.png) | | 53 | 22 | [Download](53/dataset.zip) | ![preview 1](53/preview_1.png) | ![preview 2](53/preview_2.png) | ![preview 3](53/preview_3.png) | ![preview 4](53/preview_4.png) | ![preview 5](53/preview_5.png) | ![preview 6](53/preview_6.png) | ![preview 7](53/preview_7.png) | ![preview 8](53/preview_8.png) | | 54 | 42 | [Download](54/dataset.zip) | ![preview 1](54/preview_1.png) | ![preview 2](54/preview_2.png) | ![preview 3](54/preview_3.png) | ![preview 4](54/preview_4.png) | ![preview 5](54/preview_5.png) | ![preview 6](54/preview_6.png) | ![preview 7](54/preview_7.png) | ![preview 8](54/preview_8.png) | | 55 | 6 | [Download](55/dataset.zip) | ![preview 1](55/preview_1.png) | ![preview 2](55/preview_2.png) | ![preview 3](55/preview_3.png) | ![preview 4](55/preview_4.png) | 
![preview 5](55/preview_5.png) | ![preview 6](55/preview_6.png) | N/A | N/A | | 56 | 29 | [Download](56/dataset.zip) | ![preview 1](56/preview_1.png) | ![preview 2](56/preview_2.png) | ![preview 3](56/preview_3.png) | ![preview 4](56/preview_4.png) | ![preview 5](56/preview_5.png) | ![preview 6](56/preview_6.png) | ![preview 7](56/preview_7.png) | ![preview 8](56/preview_8.png) | | 57 | 114 | [Download](57/dataset.zip) | ![preview 1](57/preview_1.png) | ![preview 2](57/preview_2.png) | ![preview 3](57/preview_3.png) | ![preview 4](57/preview_4.png) | ![preview 5](57/preview_5.png) | ![preview 6](57/preview_6.png) | ![preview 7](57/preview_7.png) | ![preview 8](57/preview_8.png) | | 58 | 38 | [Download](58/dataset.zip) | ![preview 1](58/preview_1.png) | ![preview 2](58/preview_2.png) | ![preview 3](58/preview_3.png) | ![preview 4](58/preview_4.png) | ![preview 5](58/preview_5.png) | ![preview 6](58/preview_6.png) | ![preview 7](58/preview_7.png) | ![preview 8](58/preview_8.png) | | 59 | 430 | [Download](59/dataset.zip) | ![preview 1](59/preview_1.png) | ![preview 2](59/preview_2.png) | ![preview 3](59/preview_3.png) | ![preview 4](59/preview_4.png) | ![preview 5](59/preview_5.png) | ![preview 6](59/preview_6.png) | ![preview 7](59/preview_7.png) | ![preview 8](59/preview_8.png) | | 60 | 13 | [Download](60/dataset.zip) | ![preview 1](60/preview_1.png) | ![preview 2](60/preview_2.png) | ![preview 3](60/preview_3.png) | ![preview 4](60/preview_4.png) | ![preview 5](60/preview_5.png) | ![preview 6](60/preview_6.png) | ![preview 7](60/preview_7.png) | ![preview 8](60/preview_8.png) | | 61 | 17 | [Download](61/dataset.zip) | ![preview 1](61/preview_1.png) | ![preview 2](61/preview_2.png) | ![preview 3](61/preview_3.png) | ![preview 4](61/preview_4.png) | ![preview 5](61/preview_5.png) | ![preview 6](61/preview_6.png) | ![preview 7](61/preview_7.png) | ![preview 8](61/preview_8.png) | | 62 | 17 | [Download](62/dataset.zip) | ![preview 1](62/preview_1.png) | ![preview 
2](62/preview_2.png) | ![preview 3](62/preview_3.png) | ![preview 4](62/preview_4.png) | ![preview 5](62/preview_5.png) | ![preview 6](62/preview_6.png) | ![preview 7](62/preview_7.png) | ![preview 8](62/preview_8.png) | | 63 | 13 | [Download](63/dataset.zip) | ![preview 1](63/preview_1.png) | ![preview 2](63/preview_2.png) | ![preview 3](63/preview_3.png) | ![preview 4](63/preview_4.png) | ![preview 5](63/preview_5.png) | ![preview 6](63/preview_6.png) | ![preview 7](63/preview_7.png) | ![preview 8](63/preview_8.png) | | 64 | 15 | [Download](64/dataset.zip) | ![preview 1](64/preview_1.png) | ![preview 2](64/preview_2.png) | ![preview 3](64/preview_3.png) | ![preview 4](64/preview_4.png) | ![preview 5](64/preview_5.png) | ![preview 6](64/preview_6.png) | ![preview 7](64/preview_7.png) | ![preview 8](64/preview_8.png) | | 65 | 22 | [Download](65/dataset.zip) | ![preview 1](65/preview_1.png) | ![preview 2](65/preview_2.png) | ![preview 3](65/preview_3.png) | ![preview 4](65/preview_4.png) | ![preview 5](65/preview_5.png) | ![preview 6](65/preview_6.png) | ![preview 7](65/preview_7.png) | ![preview 8](65/preview_8.png) | | 66 | 35 | [Download](66/dataset.zip) | ![preview 1](66/preview_1.png) | ![preview 2](66/preview_2.png) | ![preview 3](66/preview_3.png) | ![preview 4](66/preview_4.png) | ![preview 5](66/preview_5.png) | ![preview 6](66/preview_6.png) | ![preview 7](66/preview_7.png) | ![preview 8](66/preview_8.png) | | 67 | 219 | [Download](67/dataset.zip) | ![preview 1](67/preview_1.png) | ![preview 2](67/preview_2.png) | ![preview 3](67/preview_3.png) | ![preview 4](67/preview_4.png) | ![preview 5](67/preview_5.png) | ![preview 6](67/preview_6.png) | ![preview 7](67/preview_7.png) | ![preview 8](67/preview_8.png) | | 68 | 44 | [Download](68/dataset.zip) | ![preview 1](68/preview_1.png) | ![preview 2](68/preview_2.png) | ![preview 3](68/preview_3.png) | ![preview 4](68/preview_4.png) | ![preview 5](68/preview_5.png) | ![preview 6](68/preview_6.png) | ![preview 
7](68/preview_7.png) | ![preview 8](68/preview_8.png) | | 69 | 44 | [Download](69/dataset.zip) | ![preview 1](69/preview_1.png) | ![preview 2](69/preview_2.png) | ![preview 3](69/preview_3.png) | ![preview 4](69/preview_4.png) | ![preview 5](69/preview_5.png) | ![preview 6](69/preview_6.png) | ![preview 7](69/preview_7.png) | ![preview 8](69/preview_8.png) | | 70 | 9 | [Download](70/dataset.zip) | ![preview 1](70/preview_1.png) | ![preview 2](70/preview_2.png) | ![preview 3](70/preview_3.png) | ![preview 4](70/preview_4.png) | ![preview 5](70/preview_5.png) | ![preview 6](70/preview_6.png) | ![preview 7](70/preview_7.png) | ![preview 8](70/preview_8.png) | | 71 | 8 | [Download](71/dataset.zip) | ![preview 1](71/preview_1.png) | ![preview 2](71/preview_2.png) | ![preview 3](71/preview_3.png) | ![preview 4](71/preview_4.png) | ![preview 5](71/preview_5.png) | ![preview 6](71/preview_6.png) | ![preview 7](71/preview_7.png) | ![preview 8](71/preview_8.png) | | 72 | 34 | [Download](72/dataset.zip) | ![preview 1](72/preview_1.png) | ![preview 2](72/preview_2.png) | ![preview 3](72/preview_3.png) | ![preview 4](72/preview_4.png) | ![preview 5](72/preview_5.png) | ![preview 6](72/preview_6.png) | ![preview 7](72/preview_7.png) | ![preview 8](72/preview_8.png) | | 73 | 47 | [Download](73/dataset.zip) | ![preview 1](73/preview_1.png) | ![preview 2](73/preview_2.png) | ![preview 3](73/preview_3.png) | ![preview 4](73/preview_4.png) | ![preview 5](73/preview_5.png) | ![preview 6](73/preview_6.png) | ![preview 7](73/preview_7.png) | ![preview 8](73/preview_8.png) | | 74 | 15 | [Download](74/dataset.zip) | ![preview 1](74/preview_1.png) | ![preview 2](74/preview_2.png) | ![preview 3](74/preview_3.png) | ![preview 4](74/preview_4.png) | ![preview 5](74/preview_5.png) | ![preview 6](74/preview_6.png) | ![preview 7](74/preview_7.png) | ![preview 8](74/preview_8.png) | | 75 | 35 | [Download](75/dataset.zip) | ![preview 1](75/preview_1.png) | ![preview 2](75/preview_2.png) | 
![preview 3](75/preview_3.png) | ![preview 4](75/preview_4.png) | ![preview 5](75/preview_5.png) | ![preview 6](75/preview_6.png) | ![preview 7](75/preview_7.png) | ![preview 8](75/preview_8.png) | | 76 | 15 | [Download](76/dataset.zip) | ![preview 1](76/preview_1.png) | ![preview 2](76/preview_2.png) | ![preview 3](76/preview_3.png) | ![preview 4](76/preview_4.png) | ![preview 5](76/preview_5.png) | ![preview 6](76/preview_6.png) | ![preview 7](76/preview_7.png) | ![preview 8](76/preview_8.png) | | 77 | 17 | [Download](77/dataset.zip) | ![preview 1](77/preview_1.png) | ![preview 2](77/preview_2.png) | ![preview 3](77/preview_3.png) | ![preview 4](77/preview_4.png) | ![preview 5](77/preview_5.png) | ![preview 6](77/preview_6.png) | ![preview 7](77/preview_7.png) | ![preview 8](77/preview_8.png) | | 78 | 18 | [Download](78/dataset.zip) | ![preview 1](78/preview_1.png) | ![preview 2](78/preview_2.png) | ![preview 3](78/preview_3.png) | ![preview 4](78/preview_4.png) | ![preview 5](78/preview_5.png) | ![preview 6](78/preview_6.png) | ![preview 7](78/preview_7.png) | ![preview 8](78/preview_8.png) | | 79 | 12 | [Download](79/dataset.zip) | ![preview 1](79/preview_1.png) | ![preview 2](79/preview_2.png) | ![preview 3](79/preview_3.png) | ![preview 4](79/preview_4.png) | ![preview 5](79/preview_5.png) | ![preview 6](79/preview_6.png) | ![preview 7](79/preview_7.png) | ![preview 8](79/preview_8.png) | | 80 | 8 | [Download](80/dataset.zip) | ![preview 1](80/preview_1.png) | ![preview 2](80/preview_2.png) | ![preview 3](80/preview_3.png) | ![preview 4](80/preview_4.png) | ![preview 5](80/preview_5.png) | ![preview 6](80/preview_6.png) | ![preview 7](80/preview_7.png) | ![preview 8](80/preview_8.png) | | 81 | 9 | [Download](81/dataset.zip) | ![preview 1](81/preview_1.png) | ![preview 2](81/preview_2.png) | ![preview 3](81/preview_3.png) | ![preview 4](81/preview_4.png) | ![preview 5](81/preview_5.png) | ![preview 6](81/preview_6.png) | ![preview 7](81/preview_7.png) | 
![preview 8](81/preview_8.png) | | 82 | 14 | [Download](82/dataset.zip) | ![preview 1](82/preview_1.png) | ![preview 2](82/preview_2.png) | ![preview 3](82/preview_3.png) | ![preview 4](82/preview_4.png) | ![preview 5](82/preview_5.png) | ![preview 6](82/preview_6.png) | ![preview 7](82/preview_7.png) | ![preview 8](82/preview_8.png) | | 83 | 11 | [Download](83/dataset.zip) | ![preview 1](83/preview_1.png) | ![preview 2](83/preview_2.png) | ![preview 3](83/preview_3.png) | ![preview 4](83/preview_4.png) | ![preview 5](83/preview_5.png) | ![preview 6](83/preview_6.png) | ![preview 7](83/preview_7.png) | ![preview 8](83/preview_8.png) | | 84 | 8 | [Download](84/dataset.zip) | ![preview 1](84/preview_1.png) | ![preview 2](84/preview_2.png) | ![preview 3](84/preview_3.png) | ![preview 4](84/preview_4.png) | ![preview 5](84/preview_5.png) | ![preview 6](84/preview_6.png) | ![preview 7](84/preview_7.png) | ![preview 8](84/preview_8.png) | | 85 | 11 | [Download](85/dataset.zip) | ![preview 1](85/preview_1.png) | ![preview 2](85/preview_2.png) | ![preview 3](85/preview_3.png) | ![preview 4](85/preview_4.png) | ![preview 5](85/preview_5.png) | ![preview 6](85/preview_6.png) | ![preview 7](85/preview_7.png) | ![preview 8](85/preview_8.png) | | 86 | 9 | [Download](86/dataset.zip) | ![preview 1](86/preview_1.png) | ![preview 2](86/preview_2.png) | ![preview 3](86/preview_3.png) | ![preview 4](86/preview_4.png) | ![preview 5](86/preview_5.png) | ![preview 6](86/preview_6.png) | ![preview 7](86/preview_7.png) | ![preview 8](86/preview_8.png) | | 87 | 24 | [Download](87/dataset.zip) | ![preview 1](87/preview_1.png) | ![preview 2](87/preview_2.png) | ![preview 3](87/preview_3.png) | ![preview 4](87/preview_4.png) | ![preview 5](87/preview_5.png) | ![preview 6](87/preview_6.png) | ![preview 7](87/preview_7.png) | ![preview 8](87/preview_8.png) | | 88 | 6 | [Download](88/dataset.zip) | ![preview 1](88/preview_1.png) | ![preview 2](88/preview_2.png) | ![preview 3](88/preview_3.png) | 
![preview 4](88/preview_4.png) | ![preview 5](88/preview_5.png) | ![preview 6](88/preview_6.png) | N/A | N/A | | 89 | 20 | [Download](89/dataset.zip) | ![preview 1](89/preview_1.png) | ![preview 2](89/preview_2.png) | ![preview 3](89/preview_3.png) | ![preview 4](89/preview_4.png) | ![preview 5](89/preview_5.png) | ![preview 6](89/preview_6.png) | ![preview 7](89/preview_7.png) | ![preview 8](89/preview_8.png) | | 90 | 19 | [Download](90/dataset.zip) | ![preview 1](90/preview_1.png) | ![preview 2](90/preview_2.png) | ![preview 3](90/preview_3.png) | ![preview 4](90/preview_4.png) | ![preview 5](90/preview_5.png) | ![preview 6](90/preview_6.png) | ![preview 7](90/preview_7.png) | ![preview 8](90/preview_8.png) | | noise | 355 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
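Each cluster row above links a `dataset.zip` stored under a numbered folder in the repo (the "noise" bucket uses `-1`). A minimal sketch of assembling the direct download URL for one cluster's package, assuming the standard Hub `resolve/main` layout; the owning repo id is not visible in this excerpt, so `CyberHarem/example_dataset` below is a placeholder:

```python
def cluster_zip_url(repo_id: str, cluster_id: int) -> str:
    # Clusters live in numbered folders ("87/dataset.zip"); the
    # "noise" bucket uses "-1" as its folder name.
    return (
        f"https://huggingface.co/datasets/{repo_id}"
        f"/resolve/main/{cluster_id}/dataset.zip"
    )

# NOTE: placeholder repo id -- substitute the actual dataset repo.
print(cluster_zip_url("CyberHarem/example_dataset", 87))
```

The same URL pattern works with `hf_hub_download` by passing the relative path (e.g. `"87/dataset.zip"`) as `filename`.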
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_C_Q_rices_ns_200
--- dataset_info: features: - name: id dtype: int64 - name: question dtype: string - name: true_label sequence: string - name: prediction dtype: string splits: - name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__ num_bytes: 28618 num_examples: 200 download_size: 0 dataset_size: 28618 --- # Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_C_Q_rices_ns_200" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
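The card stores `true_label` as a sequence of acceptable answers and `prediction` as a single string. A hedged sketch of a simple exact-match accuracy over such records (field names taken from the card's feature list; the lowercasing/stripping normalization is an assumption, not something the card specifies):

```python
def normalize(ans: str) -> str:
    # Lowercase and strip surrounding whitespace before comparing --
    # a common, but here assumed, normalization for VQA answers.
    return ans.strip().lower()

def exact_match_accuracy(records) -> float:
    # A record counts as correct if its prediction equals any accepted label.
    hits = sum(
        normalize(r["prediction"]) in {normalize(t) for t in r["true_label"]}
        for r in records
    )
    return hits / len(records)

sample = [
    {"prediction": "Yes", "true_label": ["yes", "no"]},
    {"prediction": "two", "true_label": ["2"]},
]
print(exact_match_accuracy(sample))  # 0.5 -- one of two records matches
```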
iohadrubin/nq_closedbook
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: question dtype: string - name: answer sequence: string splits: - name: train num_bytes: 6012212 num_examples: 79168 - name: validation num_bytes: 663176 num_examples: 8757 - name: test num_bytes: 314639 num_examples: 3610 download_size: 213903 dataset_size: 6990027 --- # Dataset Card for "nq_closedbook" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
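The split byte counts quoted in the card should sum to its `dataset_size`; a quick arithmetic sanity check over those numbers:

```python
# Per-split num_bytes from the card above.
splits = {"train": 6_012_212, "validation": 663_176, "test": 314_639}
total = sum(splits.values())
print(total)  # 6990027, matching the card's dataset_size
```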
emaeon/train1
--- dataset_info: features: - name: code1 dtype: string - name: code2 dtype: string - name: similar dtype: int64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 9012765960 num_examples: 5000000 download_size: 0 dataset_size: 9012765960 --- # Dataset Card for "train1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
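The card pairs two code snippets (`code1`, `code2`) with a binary `similar` label, which suggests a clone-detection setup. A minimal baseline sketch scoring such a pair with whitespace-token Jaccard similarity (the baseline itself and the field usage are assumptions; the card documents only the schema):

```python
def token_jaccard(code1: str, code2: str) -> float:
    # Whitespace-token Jaccard similarity: a crude, assumed baseline
    # for the binary `similar` label.
    a, b = set(code1.split()), set(code2.split())
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

pair = {"code1": "x = 1 + 2", "code2": "x = 1 + 3", "similar": 1}
score = token_jaccard(pair["code1"], pair["code2"])
print(round(score, 2))  # 0.67 -- 4 shared tokens out of 6 distinct
```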
jbb/coq_code
--- license: mit --- Data for fine-tuning using the Coq framework for mathematical formalization.
alexcom/analisis-sentimientos-textos-turisitcos-mx-polaridad
--- dataset_info: features: - name: text dtype: string - name: label dtype: int64 splits: - name: train num_bytes: 71496873 num_examples: 176192 - name: test num_bytes: 30856228 num_examples: 75510 download_size: 62497427 dataset_size: 102353101 --- # Dataset Card for "analisis-sentimientos-textos-turisitcos-mx-polaridad" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
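The card exposes only `text` and `label` (int64) features, without documenting the label-to-polarity mapping. A small sketch of inspecting the class distribution, using hypothetical rows that mirror the schema:

```python
from collections import Counter

# Hypothetical rows mirroring the card's `text`/`label` features;
# the actual label-to-polarity mapping is not documented in the card.
rows = [
    {"text": "Excelente servicio y vista al mar", "label": 2},
    {"text": "La habitacion estaba sucia", "label": 0},
    {"text": "Normal, nada especial", "label": 1},
    {"text": "Volveria sin dudarlo", "label": 2},
]
dist = Counter(r["label"] for r in rows)
print(dict(sorted(dist.items())))  # {0: 1, 1: 1, 2: 2}
```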
sanps/GutenbergFiction
--- dataset_info: features: - name: file_id dtype: string - name: text_sub_id dtype: int64 - name: text dtype: string - name: tokens dtype: int64 splits: - name: train num_bytes: 1696432957 num_examples: 393386 download_size: 1069041271 dataset_size: 1696432957 configs: - config_name: default data_files: - split: train path: data/train-* language: - en --- English books from gutenberg.org tagged as fiction, each with at least 25 downloads, split into paragraphs. For license details see: https://www.gutenberg.org/policy/permission.html
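Each row pairs a paragraph of text with a per-paragraph `tokens` count. A sketch of building one such record, assuming a simple whitespace token count (the card does not say which tokenizer produced the `tokens` field, so treat this counting scheme as an assumption):

```python
def paragraph_record(file_id: str, sub_id: int, text: str) -> dict:
    # Mirrors the card's schema; whitespace token counting is an
    # assumption -- the actual tokenizer is unspecified.
    return {
        "file_id": file_id,
        "text_sub_id": sub_id,
        "text": text,
        "tokens": len(text.split()),
    }

rec = paragraph_record("pg1342", 0, "It is a truth universally acknowledged")
print(rec["tokens"])  # 6
```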
salma-remyx/ffmperative_sample_10k
--- dataset_info: features: - name: prompt dtype: string - name: response dtype: string splits: - name: train num_bytes: 4049349 num_examples: 10000 download_size: 1276340 dataset_size: 4049349 --- # Dataset Card for "ffmperative_sample_10k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
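With plain `prompt`/`response` columns, the dataset prescribes no chat template, so any concatenation for supervised fine-tuning is a choice the user makes. One plausible (assumed) format, shown with a made-up ffmpeg-style example in the spirit of the dataset name:

```python
def to_training_text(example: dict) -> str:
    # One assumed prompt/response template for supervised fine-tuning;
    # the dataset itself prescribes no particular format.
    return (
        f"### Prompt:\n{example['prompt']}\n"
        f"### Response:\n{example['response']}"
    )

# Hypothetical example row (not taken from the dataset).
ex = {"prompt": "Trim video.mp4 to the first 10 seconds.",
      "response": "ffmpeg -i video.mp4 -t 10 -c copy out.mp4"}
text = to_training_text(ex)
print(text.count("###"))  # 2 -- one marker per section
```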
Elise-hf/PwC
--- dataset_info: features: - name: uid dtype: int64 - name: paper_url dtype: string - name: arxiv_id dtype: string - name: title dtype: string - name: abstract dtype: string - name: url_abs dtype: string - name: url_pdf dtype: string - name: proceeding dtype: string - name: authors sequence: string - name: tasks sequence: string - name: date dtype: float64 - name: methods list: - name: code_snippet_url dtype: string - name: description dtype: string - name: full_name dtype: string - name: introduced_year dtype: int64 - name: main_collection struct: - name: area dtype: string - name: description dtype: string - name: name dtype: string - name: parent dtype: string - name: name dtype: string - name: source_title dtype: string - name: source_url dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 437349959 num_examples: 149495 - name: test num_bytes: 110099655 num_examples: 37108 download_size: 183963479 dataset_size: 547449614 --- # Dataset Card for "PwC" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Gille__StrangeMerges_48-7B-dare_ties
--- pretty_name: Evaluation run of Gille/StrangeMerges_48-7B-dare_ties dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Gille/StrangeMerges_48-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_48-7B-dare_ties)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_48-7B-dare_ties\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-02T20:33:07.066186](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_48-7B-dare_ties/blob/main/results_2024-04-02T20-33-07.066186.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4968539388371313,\n\ \ \"acc_stderr\": 0.03444887235423934,\n \"acc_norm\": 0.5022418509336306,\n\ \ \"acc_norm_stderr\": 0.035226049628712106,\n \"mc1\": 0.4920440636474908,\n\ \ \"mc1_stderr\": 0.017501285074551835,\n \"mc2\": 0.6554830835773625,\n\ \ \"mc2_stderr\": 0.015261795466175352\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403082,\n\ \ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513782\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5912168890659231,\n\ \ \"acc_stderr\": 0.004906043613013399,\n \"acc_norm\": 0.8013343955387373,\n\ \ \"acc_norm_stderr\": 0.003981802822377581\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\ \ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\ \ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\ \ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n\ \ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\ \ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\ \ \"acc_norm_stderr\": 0.04140685639111503\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\ \ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\ \ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n\ \ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\ \ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\ \ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.41228070175438597,\n\ \ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596426,\n \"\ acc_norm\": 0.3439153439153439,\n 
\"acc_norm_stderr\": 0.024464426625596426\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\ \ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\ \ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n\ \ \"acc_stderr\": 0.02840609505765332,\n \"acc_norm\": 0.5258064516129032,\n\ \ \"acc_norm_stderr\": 0.02840609505765332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\ \ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.38181818181818183,\n \"acc_stderr\": 0.037937131711656344,\n\ \ \"acc_norm\": 0.38181818181818183,\n \"acc_norm_stderr\": 0.037937131711656344\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\ acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n\ \ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n\ \ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \ \ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \ \ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"\ acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3194444444444444,\n \"acc_stderr\": 0.03179876342176851,\n \"\ acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03179876342176851\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.47058823529411764,\n \"acc_stderr\": 0.035032352963679916,\n \"\ acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.035032352963679916\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6371308016877637,\n \"acc_stderr\": 0.031299208255302136,\n \ \ \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.031299208255302136\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\ \ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\ \ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\ \ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\ acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\ \ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041696,\n\ \ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041696\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\ \ \"acc_stderr\": 0.02828632407556441,\n \"acc_norm\": 0.7521367521367521,\n\ \ \"acc_norm_stderr\": 0.02828632407556441\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\ \ \"acc_stderr\": 0.016706381415057897,\n \"acc_norm\": 0.6781609195402298,\n\ \ \"acc_norm_stderr\": 0.016706381415057897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.569364161849711,\n \"acc_stderr\": 0.026658800273672383,\n\ \ \"acc_norm\": 0.569364161849711,\n \"acc_norm_stderr\": 0.026658800273672383\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\ \ \"acc_stderr\": 0.01536686038639711,\n \"acc_norm\": 
0.3027932960893855,\n\ \ \"acc_norm_stderr\": 0.01536686038639711\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n\ \ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\ \ \"acc_stderr\": 0.028345045864840622,\n \"acc_norm\": 0.5305466237942122,\n\ \ \"acc_norm_stderr\": 0.028345045864840622\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n\ \ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \ \ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34810951760104303,\n\ \ \"acc_stderr\": 0.012166738993698191,\n \"acc_norm\": 0.34810951760104303,\n\ \ \"acc_norm_stderr\": 0.012166738993698191\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n\ \ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.477124183006536,\n \"acc_stderr\": 0.020206653187884786,\n \ \ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.020206653187884786\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\ \ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\ \ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\ \ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\ \ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\ \ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\ \ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n\ \ \"mc1_stderr\": 0.017501285074551835,\n \"mc2\": 0.6554830835773625,\n\ \ \"mc2_stderr\": 0.015261795466175352\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011874\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15390447308567096,\n \ \ \"acc_stderr\": 0.009939799304049032\n }\n}\n```" repo_url: https://huggingface.co/Gille/StrangeMerges_48-7B-dare_ties leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|arc:challenge|25_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-02T20-33-07.066186.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|gsm8k|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hellaswag_10 data_files: - split: 
2024_04_02T20_33_07.066186 path: - '**/details_harness|hellaswag|10_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-33-07.066186.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-33-07.066186.parquet' - 
'**/details_harness|hendrycksTest-management|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-33-07.066186.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-33-07.066186.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-33-07.066186.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-33-07.066186.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-33-07.066186.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-33-07.066186.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-management|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-33-07.066186.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-33-07.066186.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|truthfulqa:mc|0_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-02T20-33-07.066186.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_02T20_33_07.066186 path: - '**/details_harness|winogrande|5_2024-04-02T20-33-07.066186.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-02T20-33-07.066186.parquet' - config_name: results data_files: - split: 
2024_04_02T20_33_07.066186 path: - results_2024-04-02T20-33-07.066186.parquet - split: latest path: - results_2024-04-02T20-33-07.066186.parquet --- # Dataset Card for Evaluation run of Gille/StrangeMerges_48-7B-dare_ties <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_48-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_48-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_48-7B-dare_ties", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-04-02T20:33:07.066186](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_48-7B-dare_ties/blob/main/results_2024-04-02T20-33-07.066186.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4968539388371313, "acc_stderr": 0.03444887235423934, "acc_norm": 0.5022418509336306, "acc_norm_stderr": 0.035226049628712106, "mc1": 0.4920440636474908, "mc1_stderr": 0.017501285074551835, "mc2": 0.6554830835773625, "mc2_stderr": 0.015261795466175352 }, "harness|arc:challenge|25": { "acc": 0.5827645051194539, "acc_stderr": 0.014409825518403082, "acc_norm": 0.6092150170648464, "acc_norm_stderr": 0.014258563880513782 }, "harness|hellaswag|10": { "acc": 0.5912168890659231, "acc_stderr": 0.004906043613013399, "acc_norm": 0.8013343955387373, "acc_norm_stderr": 0.003981802822377581 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750575, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750575 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5197368421052632, "acc_stderr": 0.040657710025626036, "acc_norm": 0.5197368421052632, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5320754716981132, "acc_stderr": 0.03070948699255655, "acc_norm": 0.5320754716981132, "acc_norm_stderr": 0.03070948699255655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5694444444444444, "acc_stderr": 0.04140685639111503, "acc_norm": 0.5694444444444444, "acc_norm_stderr": 0.04140685639111503 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, 
"acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.04389869956808777, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.04389869956808777 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4085106382978723, "acc_stderr": 0.03213418026701576, "acc_norm": 0.4085106382978723, "acc_norm_stderr": 0.03213418026701576 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.41228070175438597, "acc_stderr": 0.04630653203366596, "acc_norm": 0.41228070175438597, "acc_norm_stderr": 0.04630653203366596 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.041618085035015295, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3439153439153439, "acc_stderr": 0.024464426625596426, "acc_norm": 0.3439153439153439, "acc_norm_stderr": 0.024464426625596426 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3412698412698413, "acc_stderr": 0.04240799327574925, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.04240799327574925 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5258064516129032, "acc_stderr": 0.02840609505765332, "acc_norm": 0.5258064516129032, "acc_norm_stderr": 0.02840609505765332 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3793103448275862, "acc_stderr": 0.03413963805906235, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.03413963805906235 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.38181818181818183, "acc_stderr": 0.037937131711656344, "acc_norm": 0.38181818181818183, "acc_norm_stderr": 0.037937131711656344 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6212121212121212, "acc_stderr": 0.03456088731993747, "acc_norm": 0.6212121212121212, "acc_norm_stderr": 0.03456088731993747 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7253886010362695, "acc_stderr": 0.03221024508041153, "acc_norm": 0.7253886010362695, "acc_norm_stderr": 0.03221024508041153 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4794871794871795, "acc_stderr": 0.025329663163489943, "acc_norm": 0.4794871794871795, "acc_norm_stderr": 0.025329663163489943 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2518518518518518, "acc_stderr": 0.02646611753895991, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.02646611753895991 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4831932773109244, "acc_stderr": 0.03246013680375308, "acc_norm": 0.4831932773109244, "acc_norm_stderr": 0.03246013680375308 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6990825688073394, "acc_stderr": 0.019664751366802114, "acc_norm": 0.6990825688073394, "acc_norm_stderr": 0.019664751366802114 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3194444444444444, "acc_stderr": 
0.03179876342176851, "acc_norm": 0.3194444444444444, "acc_norm_stderr": 0.03179876342176851 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.47058823529411764, "acc_stderr": 0.035032352963679916, "acc_norm": 0.47058823529411764, "acc_norm_stderr": 0.035032352963679916 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6371308016877637, "acc_stderr": 0.031299208255302136, "acc_norm": 0.6371308016877637, "acc_norm_stderr": 0.031299208255302136 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5515695067264574, "acc_stderr": 0.03337883736255098, "acc_norm": 0.5515695067264574, "acc_norm_stderr": 0.03337883736255098 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5725190839694656, "acc_stderr": 0.04338920305792401, "acc_norm": 0.5725190839694656, "acc_norm_stderr": 0.04338920305792401 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.04205953933884123, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.04205953933884123 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04750077341199984, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04750077341199984 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5276073619631901, "acc_stderr": 0.0392237829061099, "acc_norm": 0.5276073619631901, "acc_norm_stderr": 0.0392237829061099 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.6407766990291263, "acc_stderr": 0.04750458399041696, "acc_norm": 0.6407766990291263, "acc_norm_stderr": 0.04750458399041696 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7521367521367521, "acc_stderr": 0.02828632407556441, "acc_norm": 0.7521367521367521, "acc_norm_stderr": 0.02828632407556441 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.5, "acc_stderr": 
0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6781609195402298, "acc_stderr": 0.016706381415057897, "acc_norm": 0.6781609195402298, "acc_norm_stderr": 0.016706381415057897 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.569364161849711, "acc_stderr": 0.026658800273672383, "acc_norm": 0.569364161849711, "acc_norm_stderr": 0.026658800273672383 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3027932960893855, "acc_stderr": 0.01536686038639711, "acc_norm": 0.3027932960893855, "acc_norm_stderr": 0.01536686038639711 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5130718954248366, "acc_stderr": 0.028620130800700246, "acc_norm": 0.5130718954248366, "acc_norm_stderr": 0.028620130800700246 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5305466237942122, "acc_stderr": 0.028345045864840622, "acc_norm": 0.5305466237942122, "acc_norm_stderr": 0.028345045864840622 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5370370370370371, "acc_stderr": 0.027744313443376536, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.027744313443376536 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41843971631205673, "acc_stderr": 0.029427994039419994, "acc_norm": 0.41843971631205673, "acc_norm_stderr": 0.029427994039419994 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.34810951760104303, "acc_stderr": 0.012166738993698191, "acc_norm": 0.34810951760104303, "acc_norm_stderr": 0.012166738993698191 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.33088235294117646, "acc_stderr": 0.02858270975389844, "acc_norm": 0.33088235294117646, "acc_norm_stderr": 0.02858270975389844 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.477124183006536, "acc_stderr": 0.020206653187884786, "acc_norm": 0.477124183006536, "acc_norm_stderr": 0.020206653187884786 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, 
"acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5877551020408164, "acc_stderr": 0.03151236044674268, "acc_norm": 0.5877551020408164, "acc_norm_stderr": 0.03151236044674268 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6517412935323383, "acc_stderr": 0.033687874661154596, "acc_norm": 0.6517412935323383, "acc_norm_stderr": 0.033687874661154596 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-virology|5": { "acc": 0.41566265060240964, "acc_stderr": 0.038367221765980515, "acc_norm": 0.41566265060240964, "acc_norm_stderr": 0.038367221765980515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6900584795321637, "acc_stderr": 0.035469769593931624, "acc_norm": 0.6900584795321637, "acc_norm_stderr": 0.035469769593931624 }, "harness|truthfulqa:mc|0": { "mc1": 0.4920440636474908, "mc1_stderr": 0.017501285074551835, "mc2": 0.6554830835773625, "mc2_stderr": 0.015261795466175352 }, "harness|winogrande|5": { "acc": 0.7584846093133386, "acc_stderr": 0.012028983782011874 }, "harness|gsm8k|5": { "acc": 0.15390447308567096, "acc_stderr": 0.009939799304049032 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
Ramyaa/FCE
--- license: other ---
CyberHarem/misaana_farrengram_kumakumakumabear
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Misaana Farrengram This is the dataset of Misaana Farrengram, containing 135 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). | Name | Images | Download | Description | |:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------| | raw | 135 | [Download](dataset-raw.zip) | Raw data with meta information. | | raw-stage3 | 282 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. | | 384x512 | 135 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. | | 512x512 | 135 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. | | 512x704 | 135 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. | | 640x640 | 135 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. | | 640x880 | 135 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. | | stage3-640 | 282 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. | | stage3-800 | 282 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. | | stage3-1200 | 282 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
RiazHussain/indian_food_images
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': burger '1': butter_naan '2': chai '3': chapati '4': chole_bhature '5': dal_makhani '6': dhokla '7': fried_rice '8': idli '9': jalebi '10': kaathi_rolls '11': kadai_paneer '12': kulfi '13': masala_dosa '14': momos '15': paani_puri '16': pakode '17': pav_bhaji '18': pizza '19': samosa splits: - name: train num_bytes: 1370201244.9594336 num_examples: 5328 - name: test num_bytes: 208936489.3925666 num_examples: 941 download_size: 1601617594 dataset_size: 1579137734.3520002 --- # Dataset Card for "indian_food_images" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
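The `class_label` feature above stores each dish as an integer id. As a minimal sketch of decoding those ids back to dish names without the `datasets` library (the mapping is copied from the config above; the helper names `ID2LABEL`, `LABEL2ID`, and `decode_labels` are illustrative, not part of the dataset):

```python
# Mapping copied from the class_label names declared in the dataset config.
ID2LABEL = {
    0: "burger", 1: "butter_naan", 2: "chai", 3: "chapati", 4: "chole_bhature",
    5: "dal_makhani", 6: "dhokla", 7: "fried_rice", 8: "idli", 9: "jalebi",
    10: "kaathi_rolls", 11: "kadai_paneer", 12: "kulfi", 13: "masala_dosa",
    14: "momos", 15: "paani_puri", 16: "pakode", 17: "pav_bhaji",
    18: "pizza", 19: "samosa",
}
# Inverse mapping, useful when encoding string labels for training.
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}

def decode_labels(label_ids):
    """Map integer label ids (e.g. model predictions) to dish names."""
    return [ID2LABEL[i] for i in label_ids]

print(decode_labels([13, 19, 2]))  # → ['masala_dosa', 'samosa', 'chai']
```

When loading with the `datasets` library, the same mapping is available via the feature's `int2str`/`str2int` methods.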
florath/coq-facts-props-proofs-gen0-v1
--- license: other language: - code task_categories: - text-generation tags: - mathematics - formal-proof pretty_name: Coq Facts, Propositions and Proofs size_categories: - 100K<n<1M source_datasets: - https://github.com/coq-community/autosubst.git - https://gitlab.inria.fr/fbesson/itauto.git - https://github.com/coq-community/apery.git - https://github.com/coq-contribs/dep-map.git - https://github.com/coq-community/math-classes.git - https://github.com/coq-contribs/amm11262.git - https://gitlab.inria.fr/flocq/flocq.git - https://github.com/uds-psl/coq-library-undecidability.git - https://github.com/vafeiadis/hahn.git - https://github.com/coq-community/trocq.git - https://github.com/coq-community/tarjan.git - https://github.com/coq-community/generic-environments.git - https://github.com/imdea-software/htt.git - https://gitlab.mpi-sws.org/iris/stdpp.git - https://github.com/coq-contribs/coinductive-reals.git - https://github.com/coq-community/bertrand.git - https://github.com/coq-contribs/hoare-tut.git - https://github.com/coq-contribs/cfgv.git - https://github.com/artagnon/bonak.git - https://github.com/coq-community/coq-ext-lib.git - https://github.com/coq-contribs/checker.git - https://gitlab.inria.fr/coqinterval/interval.git - https://gitlab.inria.fr/gappa/coq.git - https://github.com/imdea-software/fcsl-pcm.git - https://github.com/coq-community/proviola.git - https://gitlab.com/umb-svl/turing.git - https://github.com/damien-pous/relation-algebra.git - https://github.com/coq-community/HighSchoolGeometry.git - https://github.com/coq-community/coqeal.git - https://github.com/coq-community/semantics.git - https://github.com/coq-contribs/algebra.git - https://github.com/coq-contribs/abp.git - https://github.com/coq/coq.git - https://github.com/coq-contribs/demos.git - https://github.com/coq-community/pocklington.git - https://github.com/coq-contribs/cantor.git - https://github.com/coq-community/notation-gallery.git - https://github.com/coq-contribs/ctltctl.git - 
https://github.com/HoTT/Coq-HoTT.git - https://github.com/coq-community/corn.git - https://github.com/affeldt-aist/monae.git - https://github.com/coq-community/stalmarck.git - https://github.com/coq-community/qarith-stern-brocot.git - https://github.com/coq-contribs/bdds.git - https://github.com/DeepSpec/InteractionTrees.git - https://github.com/coq-community/lemma-overloading.git - https://github.com/coq-community/coq-art.git - https://github.com/coq-community/coq-performance-tests.git - https://github.com/impermeable/coq-waterproof.git - https://github.com/coq-contribs/dblib.git - https://github.com/math-comp/algebra-tactics.git - https://github.com/coq-community/zorns-lemma.git - https://github.com/smtcoq/smtcoq.git - https://gitlab.inria.fr/coquelicot/coquelicot.git - https://github.com/coq-community/sudoku.git - https://github.com/QuickChick/QuickChick.git - https://github.com/coq-contribs/ails.git - https://github.com/math-comp/math-comp.git - https://github.com/plclub/metalib.git - https://github.com/xavierleroy/cdf-mech-sem.git - https://github.com/CertiGraph/CertiGraph.git - https://github.com/math-comp/Coq-Combi.git - https://github.com/coq-contribs/additions.git - https://github.com/coq-community/regexp-Brzozowski.git - https://github.com/math-comp/finmap.git - https://github.com/coq-community/huffman.git - https://github.com/mattam82/Coq-Equations.git - https://github.com/math-comp/multinomials.git - https://github.com/roglo/puiseuxth.git - https://github.com/thery/FlocqLecture.git - https://github.com/math-comp/odd-order.git - https://github.com/uwplse/verdi.git - https://github.com/mit-plv/fiat.git - https://github.com/adampetcher/fcf.git - https://github.com/uncomputable/natural-number-game.git - https://github.com/mit-plv/bbv.git - https://github.com/coq-community/fourcolor.git - https://github.com/coq-community/coqtail-math.git - https://github.com/coq-community/almost-full.git - https://github.com/coq-contribs/dictionaries.git - 
https://github.com/clarksmr/sf-lectures.git - https://github.com/affeldt-aist/infotheo.git - https://github.com/coq-community/gaia.git - https://github.com/coq-contribs/circuits.git - https://github.com/coq-contribs/coq-in-coq.git - https://github.com/coq-community/hoare-tut.git - https://github.com/thery/minirubik.git - https://github.com/coq-contribs/coinductive-examples.git - https://github.com/coq-community/bits.git - https://github.com/coq-community/graph-theory.git - https://github.com/coq-community/hydra-battles.git - https://github.com/coq-community/coq-100-theorems.git - https://github.com/coq-community/metaprogramming-rosetta-stone.git - https://github.com/coq-community/buchberger.git - https://github.com/math-comp/Abel.git - https://github.com/thery/mathcomp-extra.git - https://github.com/coq-contribs/coalgebras.git - https://github.com/thery/coqprime.git - https://github.com/coq-community/atbr.git - https://github.com/coq-community/bignums.git - https://github.com/coq-community/aac-tactics.git - https://github.com/codyroux/name-the-biggest-number.git - https://github.com/math-comp/trajectories.git - https://github.com/coq-contribs/concat.git - https://github.com/math-comp/real-closed.git - https://github.com/lukaszcz/coqhammer.git - https://github.com/jwiegley/category-theory.git - https://github.com/math-comp/dioid.git - https://github.com/jwiegley/coq-haskell.git - https://github.com/math-comp/analysis.git - https://github.com/coq-community/dblib.git - https://github.com/coq-community/topology.git - https://github.com/coq-contribs/automata.git - https://github.com/coq-community/reglang.git - https://github.com/thery/T2048.git - https://github.com/lthms/FreeSpec.git - https://github.com/charguer/tlc.git - https://github.com/tchajed/coq-record-update.git - https://github.com/coq-contribs/classical-realizability.git - https://github.com/coq-community/comp-dec-modal.git - https://github.com/coq-community/coqoban.git - 
https://github.com/coq-community/parseque.git - https://github.com/UniMath/UniMath.git - https://github.com/coq-contribs/distributed-reference-counting.git - https://github.com/coq-community/jmlcoq.git - https://github.com/SSProve/ssprove.git - https://github.com/coq-community/chapar.git - https://github.com/coq-community/alea.git - https://github.com/coq-community/paramcoq.git - https://github.com/snu-sf/paco.git - https://github.com/math-comp/mczify.git - https://github.com/ilyasergey/pnp.git - https://github.com/Mtac2/Mtac2.git - https://github.com/Deducteam/coq-hol-light.git - https://github.com/coq-community/coqffi.git - https://github.com/math-comp/tutorial_material.git - https://github.com/GeoCoq/GeoCoq.git - https://github.com/math-comp/bigenough.git - https://github.com/thery/hanoi.git - https://github.com/coq-community/coq-mmaps.git - https://github.com/fblanqui/color.git - https://github.com/Matafou/LibHyps.git --- # Dataset Name: Coq Facts, Propositions and Proofs ## Dataset Description The CoqFactsPropsProofs dataset aims to enhance Large Language Models' (LLMs) proficiency in interpreting and generating Coq code by providing a comprehensive collection of over 10,000 Coq source files. It encompasses a wide array of propositions, proofs, and definitions, enriched with metadata including source references and licensing information. This dataset is designed to facilitate the development of models capable of generating syntactically correct and semantically meaningful Coq constructs, thereby advancing automated theorem proving. A detailed description can be found in the paper [Enhancing Formal Theorem Proving: A Comprehensive Dataset for Training AI Models on Coq Code](https://arxiv.org/abs/2403.12627). ## Composition * Files: Over 10,000 Coq source files (`.v` files). * Tables: Three distinct tables: facts (definitions or notations), propositions (theorems and lemmas alongside proofs), and licensing/repository information. 
* Entries: 103,446 facts and 166,035 propositions with proofs. * Size: Entry lengths range from 11 to 177,585 characters. * Source and Collection Method: The Coq source files were collected from various internet sources, focusing on repositories pivotal within the Coq community. These sources range from foundational libraries and formalized mathematical theorems to computer science theories and algorithm implementations. The collection process prioritized quality, relevance, and the contribution of each source to the Coq ecosystem. ## Licenses The dataset includes a diverse range of open-source licenses, reflecting the variety in the Coq community and the broader open-source ecosystem. Some of the licenses included are MIT, GPL (versions 2.0 and 3.0), LGPL (versions 2.1 and 3.0), Apache 2.0, BSD (2-Clause and 3-Clause), CECILL (versions 1.0, 2.1, B, C), MPL-2.0, and the UniMath License. Each entry in the dataset links to detailed license information, ensuring compliance and redistribution legality. ## Usage This dataset is provided in three parquet files and can be employed in a variety of ways depending on the use case. One example use case is training or fine-tuning models to focus on proofs rather than definitions and notations. The dataset also allows for filtering based on specific licenses using the `info.parquet` file. 
```python import pandas as pd df_facts_raw = pd.read_parquet("facts.parquet") df_info = pd.read_parquet("info.parquet") # This is the list of licenses which might be seen as permissive permissive_licenses_list = [ 'Apache-2.0', 'BSD-2-Clause', 'BSD-3-Clause', 'CECILL-B', 'CECILL-C', 'LGPL-2.1-only', 'LGPL-2.1-or-later', 'LGPL-3.0-only', 'LGPL-3.0-or-later', 'MIT', 'MPL-2.0', 'UniMath' ] # Set the license-type to permissive based on the list df_info['license-type'] = df_info['spdx-id'].apply( lambda x: 'permissive' if x in permissive_licenses_list else 'not permissive') # Merge df_facts with df_info to get the license-type information # 'symbolic_name' is the common key in both DataFrames df_facts_merged = pd.merge(df_facts_raw, df_info, on='symbolic_name', how='left') # Filter the merged DataFrame to only include entries with a permissive license df_facts = df_facts_merged[df_facts_merged['license-type'] == 'permissive'] ``` ## Experiments and Findings Initial experiments with the dataset have demonstrated its potential in improving the accuracy of LLMs in generating Coq code. Fine-tuning an existing base model with this dataset resulted in outputs predominantly in Coq syntax, highlighting the dataset's efficacy in specialized model training for formal theorem proving. ## Challenges and Limitations * High standard deviations in fact, proposition, and proof lengths indicate the presence of outliers with significantly long content. * The complexity of licensing and the manual process of license identification for each repository. ## Cite ``` @misc{florath2024enhancing, title={Enhancing Formal Theorem Proving: A Comprehensive Dataset for Training AI Models on Coq Code}, author={Andreas Florath}, year={2024}, eprint={2403.12627}, archivePrefix={arXiv}, primaryClass={cs.AI} } ``` ## Legal Disclaimer This dataset is provided 'as is' and without any warranty or guarantee of accuracy, completeness, or compliance with any specific legal regime. 
While every effort has been made to ensure that license information is accurate and up-to-date, users of this dataset are responsible for verifying the licensing information of each snippet and complying with all applicable licenses and copyright laws. Users should consider seeking legal advice to ensure their use of these snippets complies with the original authors' licensing terms and any other applicable regulations. The creators of this dataset shall not be held liable for any infringements or legal challenges arising from the use of or reliance on any materials contained within this dataset.
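The length outliers flagged under Challenges and Limitations can be screened before training with a simple character-length filter. As a sketch extending the pandas usage example above (the column name `fact`, the toy data, and the 100,000-character threshold are illustrative assumptions):

```python
import pandas as pd

# Toy stand-in frame; in practice load facts.parquet as in the usage example.
df_facts = pd.DataFrame({"fact": ["Definition id := fun x => x.", "N" * 200_000]})

# Character length of each entry, used to spot and drop extreme outliers.
lengths = df_facts["fact"].str.len()
df_trimmed = df_facts[lengths <= 100_000]

print(len(df_trimmed))  # → 1
```

The same filter applies unchanged to the propositions table, where proof bodies account for most of the very long entries.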
open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-ties
--- pretty_name: Evaluation run of CorticalStack/neurotic-crown-clown-7b-ties dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CorticalStack/neurotic-crown-clown-7b-ties](https://huggingface.co/CorticalStack/neurotic-crown-clown-7b-ties)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-ties\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-23T22:57:05.919937](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-ties/blob/main/results_2024-02-23T22-57-05.919937.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6539276099234417,\n\ \ \"acc_stderr\": 0.032053817237704396,\n \"acc_norm\": 0.6530475659283497,\n\ \ \"acc_norm_stderr\": 0.03273053707433846,\n \"mc1\": 0.5960832313341493,\n\ \ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.7650119704368432,\n\ \ \"mc2_stderr\": 0.013830432253272741\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n\ \ \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252423\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7000597490539733,\n\ \ \"acc_stderr\": 0.004572949924250631,\n \"acc_norm\": 0.8860784704242183,\n\ \ \"acc_norm_stderr\": 0.0031706661225176544\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\ \ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\ \ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\ \ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\ \ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\ \ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\ \ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\ \ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\ acc_norm\": 0.41005291005291006,\n 
\"acc_norm_stderr\": 0.025331202438944433\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\ \ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\ \ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\ \ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\ \ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\ \ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\ : 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\ \ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \ \ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\ acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\ acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\ acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \ \ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\ \ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\ \ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\ \ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\ \ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\ \ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\ \ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\ \ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\ \ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\ \ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\ \ \"acc_stderr\": 0.016657229424586313,\n \"acc_norm\": 0.4558659217877095,\n\ \ \"acc_norm_stderr\": 0.016657229424586313\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\ \ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\ \ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \ \ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\ \ \"acc_stderr\": 0.012753716929101004,\n \"acc_norm\": 0.4745762711864407,\n\ \ \"acc_norm_stderr\": 0.012753716929101004\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\ \ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \ \ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\ \ \"acc_stderr\": 
0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\ \ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\ \ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\ \ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5960832313341493,\n\ \ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.7650119704368432,\n\ \ \"mc2_stderr\": 0.013830432253272741\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272958\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \ \ \"acc_stderr\": 0.012454841668337697\n }\n}\n```" repo_url: https://huggingface.co/CorticalStack/neurotic-crown-clown-7b-ties leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|arc:challenge|25_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-23T22-57-05.919937.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|gsm8k|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hellaswag|10_2024-02-23T22-57-05.919937.parquet' - split: latest 
path: - '**/details_harness|hellaswag|10_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-57-05.919937.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-57-05.919937.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-57-05.919937.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-57-05.919937.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-57-05.919937.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-57-05.919937.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-57-05.919937.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-57-05.919937.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|truthfulqa:mc|0_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-23T22-57-05.919937.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_23T22_57_05.919937 path: - '**/details_harness|winogrande|5_2024-02-23T22-57-05.919937.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-23T22-57-05.919937.parquet' - config_name: results data_files: - split: 
2024_02_23T22_57_05.919937 path: - results_2024-02-23T22-57-05.919937.parquet - split: latest path: - results_2024-02-23T22-57-05.919937.parquet --- # Dataset Card for Evaluation run of CorticalStack/neurotic-crown-clown-7b-ties <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CorticalStack/neurotic-crown-clown-7b-ties](https://huggingface.co/CorticalStack/neurotic-crown-clown-7b-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-ties", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-23T22:57:05.919937](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-ties/blob/main/results_2024-02-23T22-57-05.919937.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6539276099234417, "acc_stderr": 0.032053817237704396, "acc_norm": 0.6530475659283497, "acc_norm_stderr": 0.03273053707433846, "mc1": 0.5960832313341493, "mc1_stderr": 0.017177276822584284, "mc2": 0.7650119704368432, "mc2_stderr": 0.013830432253272741 }, "harness|arc:challenge|25": { "acc": 0.7090443686006825, "acc_stderr": 0.01327307786590759, "acc_norm": 0.7235494880546075, "acc_norm_stderr": 0.013069662474252423 }, "harness|hellaswag|10": { "acc": 0.7000597490539733, "acc_stderr": 0.004572949924250631, "acc_norm": 0.8860784704242183, "acc_norm_stderr": 0.0031706661225176544 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322666, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237101, "acc_norm": 0.59, "acc_norm_stderr": 
0.04943110704237101 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.036146654241808254, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.036146654241808254 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944433, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944433 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642514, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642514 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971125, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971125 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683512, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683512 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.02995382389188704, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.02995382389188704 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 
0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250447, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250447 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243838, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281365, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281365 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, 
"acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.023703099525258176, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.023703099525258176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4558659217877095, "acc_stderr": 0.016657229424586313, "acc_norm": 0.4558659217877095, "acc_norm_stderr": 0.016657229424586313 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218893, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218893 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7314814814814815, "acc_stderr": 0.02465968518596728, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.02465968518596728 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4745762711864407, "acc_stderr": 0.012753716929101004, "acc_norm": 0.4745762711864407, "acc_norm_stderr": 0.012753716929101004 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6813725490196079, "acc_stderr": 0.01885008469646872, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.01885008469646872 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 
0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128445, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128445 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5960832313341493, "mc1_stderr": 0.017177276822584284, "mc2": 0.7650119704368432, "mc2_stderr": 0.013830432253272741 }, "harness|winogrande|5": { "acc": 0.8468823993685872, "acc_stderr": 0.010120623252272958 }, "harness|gsm8k|5": { "acc": 0.7134192570128886, "acc_stderr": 0.012454841668337697 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
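The per-eval metrics in the flattened results JSON above can be navigated programmatically. A minimal sketch on a small excerpt (the task keys follow the `harness|<task>|<n_shot>` convention shown above; the metric values are copied from the results):

```python
# Excerpt of the results JSON above; task keys are "harness|<task>|<n_shot>".
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7235494880546075},
    "harness|hellaswag|10": {"acc_norm": 0.8860784704242183},
    "harness|winogrande|5": {"acc": 0.8468823993685872},
}

def metric(task_results: dict, name: str) -> float:
    """Return the named metric, falling back to plain accuracy
    for tasks (like winogrande) that only report `acc`."""
    return task_results.get(name, task_results.get("acc"))

for key, task_results in results.items():
    _, task, n_shot = key.split("|")
    print(f"{task} ({n_shot}-shot): {metric(task_results, 'acc_norm'):.4f}")
```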
misza222/dreambooth-hackathon-Daphnia
--- dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 2288884.0 num_examples: 9 download_size: 2242120 dataset_size: 2288884.0 --- # Dataset Card for "dreambooth-hackathon-Daphnia" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
juancopi81/orca-math-word-problems-170034_180036
--- language: - en dataset_info: features: - name: question dtype: string - name: answer dtype: string splits: - name: train num_bytes: 15973562 num_examples: 10002 download_size: 6898844 dataset_size: 15973562 configs: - config_name: default data_files: - split: train path: data/train-* ---
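Each record pairs a `question` string with a free-form `answer` string. When scoring such math word problems, a common approach is to compare the final number in the answer text; this is a minimal sketch of that convention, which is an assumption on our part and not something this card specifies:

```python
import re
from typing import Optional

def final_number(answer: str) -> Optional[float]:
    """Return the last number appearing in an answer string, if any.
    Treating the final number as the result is an assumed convention,
    not part of this dataset's documented schema."""
    matches = re.findall(r"-?\d+(?:\.\d+)?", answer.replace(",", ""))
    return float(matches[-1]) if matches else None

print(final_number("He had 12 apples and gave away 5, so 12 - 5 = 7 remain."))  # 7.0
```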
RossVermouth/chensu_test_dataset
--- license: apache-2.0 task_categories: - image-classification language: - aa - ae tags: - not-for-all-audiences size_categories: - 1K<n<10K --- # Dataset Card for Dataset Name ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary just for test ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
vvtq/pedestrain_pose_caption_CUHK-P3_random_100
--- dataset_info: features: - name: image dtype: image - name: pose dtype: image - name: image_caption dtype: string splits: - name: train num_bytes: 2757288.0 num_examples: 100 download_size: 2651279 dataset_size: 2757288.0 --- # Dataset Card for "pedestrain_pose_caption_CUHK-P3_random_100" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
radlab/legal-mc4-pl
--- license: cc-by-4.0 task_categories: - text-generation language: - pl tags: - dataset - clm - mlm - polish - raw - jsonl pretty_name: Polish Legal MC4 size_categories: - 100M<n<1B --- `legal-mc4-pl` is extracted from the multilingual [legal-mc4](https://huggingface.co/datasets/joelito/legal-mc4) dataset. The original `legal-mc4` is distributed in _parquet_ format; this dataset provides the same data in `jsonl` format. It contains only the Polish texts. Thanks to [joelito](https://huggingface.co/joelito) for the parquet `legal-mc4` dataset.
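The `jsonl` layout used here stores one JSON object per line. A minimal stdlib-only sketch of writing and reading records in that format (the `text` field name and the sample strings are illustrative, not necessarily the dataset's actual schema):

```python
import io
import json

# Hypothetical records; the actual legal-mc4-pl schema may differ.
records = [
    {"text": "Przykładowy tekst prawny."},
    {"text": "Kolejny dokument."},
]

buf = io.StringIO()
for rec in records:
    # ensure_ascii=False keeps Polish characters readable in the file.
    buf.write(json.dumps(rec, ensure_ascii=False) + "\n")

# Reading back: one json.loads per line recovers the original records.
buf.seek(0)
loaded = [json.loads(line) for line in buf]
print(loaded == records)  # True
```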
aiacademy131/new_drug_dataset
--- dataset_info: features: - name: page_namespace dtype: int64 - name: page_title dtype: string splits: - name: train num_bytes: 30695 num_examples: 1000 download_size: 21027 dataset_size: 30695 --- # Dataset Card for "new_drug_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Siki-77/amazon6_5core_polarity
--- license: apache-2.0 --- Data source: https://cseweb.ucsd.edu/~jmcauley/datasets/amazon_v2/ We construct a new dataset from Amazon reviews (Ni et al., 2019), aggregating 5-core data over six genres: beauty, fashion, appliances, giftcards, magazines, and software. Citation: Jianmo Ni, Jiacheng Li, and Julian McAuley. Justifying recommendations using distantly-labeled reviews and fine-grained aspects. In Empirical Methods in Natural Language Processing and International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019. URL https://www.aclweb.org/anthology/D19-1018.
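A minimal sketch of aggregating per-genre 5-core reviews into one polarity dataset, assuming the usual star-rating-to-polarity mapping (ratings of 4-5 positive, 1-2 negative, 3-star reviews dropped). The thresholds and field names are assumptions for illustration, not stated in this card:

```python
# Hypothetical per-genre 5-core review excerpts; ratings are 1-5 stars.
reviews_by_genre = {
    "beauty": [{"text": "great", "overall": 5}, {"text": "meh", "overall": 3}],
    "software": [{"text": "broken", "overall": 1}],
}

def to_polarity(overall: int):
    """Map a star rating to a binary polarity label (assumed thresholds)."""
    if overall >= 4:
        return "positive"
    if overall <= 2:
        return "negative"
    return None  # drop neutral 3-star reviews

# Flatten all genres into one labeled dataset, skipping neutral reviews.
dataset = [
    {"genre": genre, "text": r["text"], "label": label}
    for genre, reviews in reviews_by_genre.items()
    for r in reviews
    if (label := to_polarity(r["overall"])) is not None
]
print(dataset)
```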
MilanHrab/Kosice_test
--- dataset_info: features: - name: name_of_record dtype: string - name: speech_array sequence: float64 - name: sampling_rate dtype: int64 - name: label dtype: string splits: - name: train num_bytes: 294710140.4 num_examples: 1120 download_size: 223895398 dataset_size: 294710140.4 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "Kosice_test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
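Since each example stores the raw waveform as a `speech_array` float sequence alongside its `sampling_rate`, the clip duration follows directly. A minimal sketch (the example values are illustrative, not taken from the dataset):

```python
# Illustrative example mirroring the schema above: the waveform is a
# plain float sequence plus an integer sampling rate.
example = {
    "name_of_record": "sample_001",
    "speech_array": [0.0] * 32000,  # 32k samples of silence
    "sampling_rate": 16000,
    "label": "test",
}

# Duration in seconds = number of samples / samples per second.
duration_s = len(example["speech_array"]) / example["sampling_rate"]
print(f"{example['name_of_record']}: {duration_s:.2f} s")  # sample_001: 2.00 s
```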
yanyc/SciGraph
--- license: mit ---