Columns: `datasetId` (string, 2–117 chars) · `card` (string, 19–1.01M chars)
Des1gn-1/vozm2
---
license: openrail
---
jglaser/pdb_protein_ligand_complexes
---
tags:
- proteins
- molecules
- chemistry
- SMILES
- complex structures
---

## How to use the data sets

This dataset contains about 36,000 unique pairs of protein sequences and ligand SMILES, together with the coordinates of their complexes from the PDB. SMILES are assumed to be tokenized by the regex from P. Schwaller.

## Ligand selection criteria

Only ligands are considered that

- have at least 3 atoms,
- have a molecular weight >= 100 Da,
- and are not among the 280 most common ligands in the PDB (this includes common additives like PEG, ADP, ...).

### Use the already preprocessed data

Load a test/train split using

```
import pandas as pd
train = pd.read_pickle('data/pdb_train.p')
test = pd.read_pickle('data/pdb_test.p')
```

Receptor features contain protein frames and side-chain angles in OpenFold/AlphaFold format. Ligand tokens which do not correspond to atoms have `nan` as their coordinates.

Documentation by example:

```
>>> import pandas as pd
>>> test = pd.read_pickle('data/pdb_test.p')
>>> test.head(5)
  pdb_id lig_id  ...                                      ligand_xyz_2d                                       ligand_bonds
0   7k38    VTY  ...  [(-2.031355975502858, -1.6316778784387098, 0.0...  [(0, 1), (1, 4), (4, 5), (5, 10), (10, 9), (9,...
1   6prt    OWA  ...  [(4.883261310160714, -0.37850716807626705, 0.0...  [(11, 18), (18, 20), (20, 8), (8, 7), (7, 2), ...
2   4lxx    FNF  ...  [(8.529427756002057, 2.2434809270065372, 0.0),...  [(51, 49), (49, 48), (48, 46), (46, 53), (53, ...
3   4lxx    FON  ...  [(-10.939694946697701, -1.1876214529096956, 0....  [(13, 1), (1, 0), (0, 3), (3, 4), (4, 7), (7, ...
4   7bp1    CAQ  ...  [(-1.9485571585149868, -1.499999999999999, 0.0...  [(4, 3), (3, 1), (1, 0), (0, 7), (7, 9), (7, 6...

[5 rows x 8 columns]
>>> test.columns
Index(['pdb_id', 'lig_id', 'seq', 'smiles', 'receptor_features', 'ligand_xyz',
       'ligand_xyz_2d', 'ligand_bonds'],
      dtype='object')
>>> test.iloc[0]['receptor_features']
{'rigidgroups_gt_frames': array([[[[-5.3122622e-01,  2.0922849e-01, -8.2098854e-01,  1.7295000e+01],
         [-7.1005428e-01, -6.3858479e-01,  2.9670244e-01, -9.1399997e-01],
         [-4.6219218e-01,  7.4056256e-01,  4.8779655e-01,  3.3284000e+01],
         [ 0.0000000e+00,  0.0000000e+00,  0.0000000e+00,  1.0000000e+00]],
        ...
        [[ 0.0000000e+00,  0.0000000e+00,  0.0000000e+00, -3.5030000e+00],
         [ 0.0000000e+00,  0.0000000e+00,  0.0000000e+00,  2.6764999e+01],
         [ 0.0000000e+00,  0.0000000e+00,  0.0000000e+00,  1.5136000e+01],
         [ 0.0000000e+00,  0.0000000e+00,  0.0000000e+00,  1.0000000e+00]]]], dtype=float32),
 'torsion_angles_sin_cos': array([[[-1.90855725e-09,  3.58859784e-02],
         [ 1.55730803e-01,  9.87799530e-01],
         [ 6.05505241e-01, -7.95841312e-01],
         ...,
         [-2.92459433e-01, -9.56277928e-01],
         [ 9.96634814e-01, -8.19697779e-02],
         [ 0.00000000e+00,  0.00000000e+00]],
        ...
        [[ 2.96455977e-04, -9.99999953e-01],
         [-8.15660990e-01,  5.78530158e-01],
         [-3.17915569e-01,  9.48119024e-01],
         ...,
         [ 0.00000000e+00,  0.00000000e+00],
         [ 0.00000000e+00,  0.00000000e+00],
         [ 0.00000000e+00,  0.00000000e+00]]])}
>>> test.iloc[0]['receptor_features'].keys()
dict_keys(['rigidgroups_gt_frames', 'torsion_angles_sin_cos'])
>>> test.iloc[0]['ligand_xyz']
[(22.289, 11.985, 9.225), (21.426, 11.623, 7.959), (nan, nan, nan), (nan, nan, nan), (21.797, 11.427, 6.574), (20.556, 11.56, 5.792), (nan, nan, nan), (20.507, 11.113, 4.552), (nan, nan, nan), (19.581, 10.97, 6.639), (20.107, 10.946, 7.954), (nan, nan, nan), (nan, nan, nan), (19.645, 10.364, 8.804)]
```

### Manual update from PDB

```
# download the PDB archive into folder pdb/
sh rsync.sh 24  # number of parallel download processes

# extract sequences and coordinates in parallel
sbatch pdb.slurm
# or
mpirun -n 42 parse_complexes.py  # desired number of tasks
```
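As a sketch of how a tokenized SMILES lines up with `ligand_xyz` (non-atom tokens carry `nan` coordinates), the widely used SMILES tokenization regex published by Schwaller et al. can be applied as below. The regex string and the pairing helper are illustrative assumptions, not code shipped with this dataset:

```python
import math
import re

# SMILES tokenization regex as published by Schwaller et al.
# (assumed here to be the "regex from P. Schwaller" the card refers to).
SMILES_REGEX = re.compile(
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:|~|@|\?|>|\*|\$|\%[0-9]{2}|[0-9])"
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into tokens."""
    return SMILES_REGEX.findall(smiles)

def atom_coordinates(tokens, ligand_xyz):
    """Pair each token with its coordinate, dropping non-atom tokens (nan entries)."""
    return [
        (tok, xyz)
        for tok, xyz in zip(tokens, ligand_xyz)
        if not any(math.isnan(c) for c in xyz)
    ]

tokens = tokenize_smiles("CC(=O)O")  # acetic acid
print(tokens)                        # ['C', 'C', '(', '=', 'O', ')', 'O']

# Hypothetical coordinates: '(' , '=' and ')' carry nan, atoms carry xyz.
xyz = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0),
       (float('nan'),) * 3, (float('nan'),) * 3,
       (2.2, 1.1, 0.0), (float('nan'),) * 3, (2.2, -1.1, 0.0)]
print(atom_coordinates(tokens, xyz))  # only the four atoms remain
```

The token list and the `ligand_xyz` list have the same length, which is what makes this positional pairing possible.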
ovior/twitter_dataset_1712993900
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 2715061
    num_examples: 8259
  download_size: 1548694
  dataset_size: 2715061
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
Yukang/LongAlpaca-16k-length
---
license: cc-by-nc-4.0
---
AdapterOcean/code_instructions_standardized_cluster_16
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: conversation_id
    dtype: int64
  - name: embedding
    sequence: float64
  - name: cluster
    dtype: int64
  splits:
  - name: train
    num_bytes: 82238736
    num_examples: 8166
  download_size: 23334034
  dataset_size: 82238736
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "code_instructions_standardized_cluster_16"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/cure_prism_hirogaruskyprecure
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of Cure Prism

This is the dataset of Cure Prism, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name        |   Images | Download                            | Description                                                             |
|:------------|---------:|:------------------------------------|:------------------------------------------------------------------------|
| raw         |      200 | [Download](dataset-raw.zip)         | Raw data with meta information.                                         |
| raw-stage3  |      456 | [Download](dataset-raw-stage3.zip)  | 3-stage cropped raw data with meta information.                         |
| 384x512     |      200 | [Download](dataset-384x512.zip)     | 384x512 aligned dataset.                                                |
| 512x512     |      200 | [Download](dataset-512x512.zip)     | 512x512 aligned dataset.                                                |
| 512x704     |      200 | [Download](dataset-512x704.zip)     | 512x704 aligned dataset.                                                |
| 640x640     |      200 | [Download](dataset-640x640.zip)     | 640x640 aligned dataset.                                                |
| 640x880     |      200 | [Download](dataset-640x880.zip)     | 640x880 aligned dataset.                                                |
| stage3-640  |      456 | [Download](dataset-stage3-640.zip)  | 3-stage cropped dataset with the shorter side not exceeding 640 pixels.  |
| stage3-800  |      456 | [Download](dataset-stage3-800.zip)  | 3-stage cropped dataset with the shorter side not exceeding 800 pixels.  |
| stage3-1200 |      456 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
marcus2000/polish_names
---
dataset_info:
  features:
  - name: '0'
    dtype: string
  - name: '1'
    dtype: string
  splits:
  - name: train
    num_bytes: 13695
    num_examples: 572
  - name: test
    num_bytes: 1549
    num_examples: 64
  download_size: 15128
  dataset_size: 15244
---

# Dataset Card for "polish_names"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
incivility-UOH/TwitCivility
---
license: mit
dataset_info:
  features:
  - name: text
    dtype: string
  - name: impoliteness
    dtype: int64
  - name: intolerance
    dtype: int64
  splits:
  - name: train
    num_bytes: 2169574.4014020115
    num_examples: 10498
  - name: test
    num_bytes: 542703.5985979884
    num_examples: 2626
  download_size: 1726706
  dataset_size: 2712278
task_categories:
- text-classification
language:
- en
---

## Overview

The TwitCivility dataset is specifically developed to classify political incivility, focusing on the multidimensional aspects of impoliteness and intolerance. Detailed methodologies are outlined in our [paper](https://arxiv.org/abs/2305.14964).

## Languages

All text is written in English.

## Dataset Structure

### Data Fields

We release TwitCivility as a data frame with the following fields:

**text**: This field contains the text (after preprocessing and anonymization) of the tweet. <br />
**impoliteness**: A binary indicator (1 or 0) representing the presence of impoliteness in the text. A value of 1 signifies impoliteness, while 0 indicates non-impoliteness. <br />
**intolerance**: Similarly, this binary value denotes the presence of intolerance in the text, with 1 indicating intolerance and 0 signifying non-intolerance. <br />

## Citation Information

```
@misc{incivility2023,
  title={Detecting Multidimensional Political Incivility on Social Media},
  author={Sagi Pendzel and Nir Lotan and Alon Zoizner and Einat Minkov},
  year={2023},
  eprint={2305.14964},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
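Because the two incivility dimensions are separate binary labels, a tweet can be impolite, intolerant, both, or neither. A minimal pandas sketch of the schema; the example rows and labels below are invented for illustration and are not drawn from TwitCivility:

```python
import pandas as pd

# Toy rows mimicking the TwitCivility schema (text, impoliteness, intolerance).
# The texts and labels are invented placeholders.
df = pd.DataFrame({
    "text": ["polite disagreement", "rude insult", "exclusionary slur", "neutral remark"],
    "impoliteness": [0, 1, 1, 0],
    "intolerance": [0, 0, 1, 0],
})

# Per-dimension prevalence, and overlap of the two dimensions.
prevalence = df[["impoliteness", "intolerance"]].mean()
both = ((df["impoliteness"] == 1) & (df["intolerance"] == 1)).sum()
print(prevalence.to_dict())  # fraction of rows flagged per dimension
print(both)                  # rows flagged on both dimensions
```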
codeparrot/codeparrot-clean
---
tags:
- python
- code
---

# CodeParrot 🦜 Dataset Cleaned

## What is it?

A dataset of Python files from GitHub. This is the deduplicated version of [codeparrot](https://huggingface.co/datasets/transformersbook/codeparrot).

## Processing

The original dataset contains a lot of duplicated and noisy data. Therefore, the dataset was cleaned with the following steps:

- Deduplication
  - Remove exact matches
- Filtering
  - Average line length < 100
  - Maximum line length < 1000
  - Alphanumeric character fraction > 0.25
  - Remove auto-generated files (keyword search)

For more details see the preprocessing script in the transformers repository [here](https://github.com/huggingface/transformers/tree/master/examples/research_projects/codeparrot).

## Splits

The dataset is split into a [train](https://huggingface.co/datasets/codeparrot/codeparrot-clean-train) and a [validation](https://huggingface.co/datasets/codeparrot/codeparrot-clean-valid) split used for training and evaluation.

## Structure

This dataset has ~50GB of code and 5361373 files.

```python
DatasetDict({
    train: Dataset({
        features: ['repo_name', 'path', 'copies', 'size', 'content', 'license', 'hash', 'line_mean', 'line_max', 'alpha_frac', 'autogenerated'],
        num_rows: 5361373
    })
})
```
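The filtering heuristics above can be sketched as a simple per-file check. The thresholds come from the card, but the function below is an illustrative reimplementation (including a guessed keyword list), not the actual preprocessing script linked in the transformers repository:

```python
def passes_filters(content: str) -> bool:
    """Apply codeparrot-clean style heuristics to one file's content.

    Thresholds follow the dataset card: mean line length < 100,
    max line length < 1000, alphanumeric fraction > 0.25, and no
    auto-generation keywords. Illustrative sketch only.
    """
    lines = content.splitlines()
    if not content or not lines:
        return False
    mean_len = sum(len(line) for line in lines) / len(lines)
    max_len = max(len(line) for line in lines)
    alpha_frac = sum(c.isalnum() for c in content) / len(content)
    # Guessed keyword list; the real script's keywords may differ.
    autogenerated = any(
        kw in content.lower()
        for kw in ("auto-generated", "autogenerated", "automatically generated")
    )
    return (
        mean_len < 100
        and max_len < 1000
        and alpha_frac > 0.25
        and not autogenerated
    )

print(passes_filters("def add(a, b):\n    return a + b\n"))  # True
print(passes_filters("# auto-generated file\nx = 1\n"))      # False
```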
junheesong/customdataset
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 865
    num_examples: 6
  download_size: 2518
  dataset_size: 865
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
SINAI/eSOL
---
license: cc-by-nc-sa-4.0
language:
- es
pretty_name: eSOL
---

# eSOL - A domain-dependent lexicon of Spanish opinion-bearing words

## Description

eSOL is a list of domain-dependent opinion-bearing words in Spanish. The domain of the word set is movie reviews. The list was built following a corpus-based approach, using the Spanish movie review corpus [Spanish Movie Reviews](http://www.lsi.us.es/~fermin/corpusCine.zip). The list consists of 2,535 positive words and 5,639 negative words. For more details on how the list was built, see the article [Semantic Orientation for Polarity Classification in Spanish Reviews](http://dx.doi.org/10.1016/j.eswa.2013.06.076).

## How to cite

Molina-González, M. D., Martínez-Cámara, E., Martín-Valdivia, M. T. & Perea-Ortega, J. M. (2013). Semantic orientation for polarity classification in Spanish reviews. Expert Systems with Applications. http://dx.doi.org/10.1016/j.eswa.2013.06.076

```
@article{MOLINAGONZALEZ20137250,
  title = {Semantic orientation for polarity classification in Spanish reviews},
  author = {M. Dolores Molina-González and Eugenio Martínez-Cámara and María-Teresa Martín-Valdivia and José M. Perea-Ortega},
  journal = {Expert Systems with Applications},
  volume = {40},
  number = {18},
  pages = {7250-7257},
  year = {2013},
  issn = {0957-4174},
  doi = {https://doi.org/10.1016/j.eswa.2013.06.076},
  url = {https://www.sciencedirect.com/science/article/pii/S0957417413004752},
  keywords = {Sentiment polarity detection, Multilingual opinion mining, Spanish resources for sentiment analysis},
}
```
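A lexicon like eSOL is typically used for polarity classification by counting matches against the positive and negative word lists. A minimal sketch; the tiny word sets below are invented placeholders, not entries from eSOL itself:

```python
# Minimal lexicon-based polarity scorer. The two word sets are toy
# placeholders; in practice they would be loaded from the eSOL lists.
POSITIVE = {"excelente", "maravillosa", "genial"}
NEGATIVE = {"aburrida", "terrible", "pesima"}

def polarity(review: str) -> str:
    """Classify a review by counting lexicon hits (semantic orientation)."""
    tokens = review.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(polarity("una pelicula excelente y genial"))  # positive
print(polarity("muy aburrida"))                     # negative
```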
rohanbalkondekar/linux_commands
--- license: mit ---
pixta-ai/Plane-images-in-multiple-scenes
# Dataset Card for pixta-ai/Plane-images-in-multiple-scenes

## Dataset Description

- **Homepage:** https://www.pixta.ai/
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**

### Dataset Summary

4,000 plane images in multiple scenes. The dataset covers multiple types of planes in unequal proportions; passenger planes make up the majority. Each image contains from 1 to 10 visible planes.

For more details, please refer to https://www.pixta.ai/ or send your inquiries to contact@pixta.ai.

### Supported Tasks and Leaderboards

object-detection, computer-vision: The dataset can be used to train or enhance models for object detection.

### Languages

English

### License

Academic & commercial usage
maldv/crabcanon
---
language:
- en
pretty_name: "Crab Canon"
tags:
- knowledge
- dialogue
- book-data
license: apache-2.0
task_categories:
- question-answering
---

# Dataset - crabcanon

- **Developed by:** maldv
- **License:** apache-2.0
- **Methodology:** Formatting book data by paragraph for training

## Description

> A crab canon (also known by the Latin form of the name, canon cancrizans; as well as retrograde canon, canon per recte et retro or canon per rectus et inversus) is an arrangement of two musical lines that are complementary and backward. If the two lines were placed next to each other (as opposed to stacked), the lines would form something conceptually similar to a palindrome. The name 'crab' refers to the fact that crabs are known to walk backward (although they can also walk forward and sideways). It originally referred to a kind of canon in which one line is played backward (e.g. FABACEAE played simultaneously with EAECABAF). An example is found in J. S. Bach's The Musical Offering, which also contains a table canon ("Quaerendo invenietis"), which combines retrogression with inversion by having one player turn the music upside down. (Wikipedia)

Long-form book text from archive.org is a pain to wrangle. This dataset is an artifact of attempting to better understand the issues involved in managing this type of data while working on a tool to process books. It contains data artifacts from processing and preparing the book *Gödel, Escher, Bach: an Eternal Golden Braid* by Douglas Hofstadter, a book which is both Pulitzer Prize winning and highly relevant to the field of AI. I have found this text imparts excellent logic skills, prose, and creativity, but it does co-mingle dialogue and narrative sets, which can be problematic.
CyberHarem/tamamo_no_mae_fgo
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of tamamo_no_mae/玉藻の前/玉藻前 (Fate/Grand Order)

This is the dataset of tamamo_no_mae/玉藻の前/玉藻前 (Fate/Grand Order), containing 500 images and their tags.

The core tags of this character are `fox_ears, pink_hair, animal_ears, yellow_eyes, fox_tail, long_hair, breasts, tail, large_breasts, fox_girl, animal_ear_fluff, bow, hair_bow, ribbon, hair_between_eyes, hair_ribbon, twintails`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             |   Images | Size       | Download                                                                                                            | Type       | Description                                                           |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              |      500 | 770.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamamo_no_mae_fgo/resolve/main/dataset-raw.zip)               | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 1200             |      500 | 677.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamamo_no_mae_fgo/resolve/main/dataset-1200.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 |     1250 | 1.29 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/tamamo_no_mae_fgo/resolve/main/dataset-stage3-p480-1200.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading.
If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/tamamo_no_mae_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 25 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, blush, looking_at_viewer, official_alternate_costume, school_uniform, white_shirt, smile, blue_skirt, pleated_skirt, white_background, plaid_skirt, simple_background, bowtie, collarbone, jacket_around_waist, red_bow, open_mouth |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_thighhighs, blush,
cleavage, collarbone, looking_at_viewer, official_alternate_costume, open_clothes, pink_bra, solo, striped_clothes, long_sleeves, fang, simple_background, smile, closed_mouth, scrunchie, sitting, white_background | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, cleavage, looking_at_viewer, necklace, official_alternate_costume, open_clothes, pink_bra, solo, striped_clothes, collarbone, black_thighhighs, simple_background, open_mouth, white_background, long_sleeves, medium_breasts, shorts, striped_jacket, sitting, smile, striped_hoodie | | 3 | 28 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, detached_sleeves, japanese_clothes, solo, cleavage, looking_at_viewer, blue_thighhighs, medium_breasts, smile, open_mouth | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, blue_kimono, detached_sleeves, looking_at_viewer, solo, blue_ribbon, cleavage, smile, blue_thighhighs, simple_background, fox_shadow_puppet, white_background, collarbone | | 5 | 26 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, blue_kimono, detached_sleeves, solo, cleavage, looking_at_viewer, smile, detached_collar, blush, simple_background, white_background, blue_thighhighs, blue_bow, wide_sleeves, closed_mouth, obi, long_sleeves, sitting, fox_shadow_puppet | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | 
![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, blue_kimono, cleavage, hair_ornament, off_shoulder, official_alternate_costume, solo, bell, looking_at_viewer, very_long_hair, long_sleeves, wide_sleeves, multiple_tails, blush, collarbone, smile | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, bare_shoulders, bell, cleavage, hair_ornament, looking_at_viewer, off_shoulder, official_alternate_costume, solo, smile, blue_kimono, collarbone, very_long_hair | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, detached_sleeves, mini_top_hat, official_alternate_costume, solo, bare_shoulders, black_thighhighs, cleavage, looking_at_viewer, smile, dress, open_mouth, blush, fang, frills, simple_background | | 9 | 8 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, looking_at_viewer, uchikake, veil, blush, solo, smile, fox_mask, official_alternate_costume, fang, oil-paper_umbrella, wataboushi, white_kimono, jingle_bell, wide_sleeves | | 10 | 12 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, looking_at_viewer, maid_headdress, solo, blush, official_alternate_costume, simple_background, frills, smile, detached_sleeves, maid_apron, cleavage, enmaided, holding, white_background, black_dress, fox_shadow_puppet | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, bare_shoulders, blush, cleavage, 
collarbone, looking_at_viewer, navel, simple_background, solo, bare_arms, black_bikini, closed_mouth, sidelocks, white_background, grey_background, groin, side-tie_bikini_bottom, smile, stomach, underboob, very_long_hair, arm_up, fang, front-tie_bikini_top, halterneck, standing, tail_raised, thighs, white_bow | | 12 | 7 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, female_service_cap, necktie, police_hat, policewoman, solo, blue_shirt, looking_at_viewer, official_alternate_costume, open_mouth, pencil_skirt, white_gloves, belt, blue_skirt, handcuffs, miniskirt, smile, black_skirt, blue_bow, blue_ribbon, blush, fang | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | official_alternate_costume | school_uniform | white_shirt | smile | blue_skirt | pleated_skirt | white_background | plaid_skirt | simple_background | bowtie | collarbone | jacket_around_waist | red_bow | open_mouth | black_thighhighs | cleavage | open_clothes | pink_bra | striped_clothes | long_sleeves | fang | closed_mouth | scrunchie | sitting | necklace | medium_breasts | shorts | striped_jacket | striped_hoodie | bare_shoulders | detached_sleeves | japanese_clothes | blue_thighhighs | blue_kimono | blue_ribbon | fox_shadow_puppet | detached_collar | blue_bow | wide_sleeves | obi | hair_ornament | off_shoulder | bell | very_long_hair | multiple_tails | mini_top_hat | dress | frills | uchikake | veil | fox_mask | oil-paper_umbrella | wataboushi | white_kimono | jingle_bell | maid_headdress | maid_apron | enmaided | holding | black_dress | navel | bare_arms | black_bikini | sidelocks | grey_background | groin | side-tie_bikini_bottom | stomach | underboob | arm_up | front-tie_bikini_top | halterneck | standing | tail_raised | thighs | white_bow | female_service_cap | necktie | police_hat | 
policewoman | blue_shirt | pencil_skirt | white_gloves | belt | handcuffs | miniskirt | black_skirt | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------|:--------------------|:-----------------------------|:-----------------|:--------------|:--------|:-------------|:----------------|:-------------------|:--------------|:--------------------|:---------|:-------------|:----------------------|:----------|:-------------|:-------------------|:-----------|:---------------|:-----------|:------------------|:---------------|:-------|:---------------|:------------|:----------|:-----------|:-----------------|:---------|:-----------------|:-----------------|:-----------------|:-------------------|:-------------------|:------------------|:--------------|:--------------|:--------------------|:------------------|:-----------|:---------------|:------|:----------------|:---------------|:-------|:-----------------|:-----------------|:---------------|:--------|:---------|:-----------|:-------|:-----------|:---------------------|:-------------|:---------------|:--------------|:-----------------|:-------------|:-----------|:----------|:--------------|:--------|:------------|:---------------|:------------|:------------------|:--------|:-------------------------|:----------|:------------|:---------|:-----------------------|:-------------|:-----------|:--------------|:---------|:------------|:---------------------|:----------|:-------------|:--------------|:-------------|:---------------|:---------------|:-------|:------------|:------------|:--------------| | 0 | 25 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | | | X | | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | | X | | | X | | X | | X | | | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 28 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | | | | X | | | | | | | | | | X | | X | | | | | | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | | | | X | | | X | | X | | X | | | | | X | | | | | | | | | | | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 26 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | | | | X | | | X | | X | | | | | | | X | | | | X | | X | | X | | | | | | X | X | | X 
| X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | | | X | | | | | | | X | | | | | X | | | | X | | | | | | | | | | X | | | | X | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | X | | | X | | | | | | | X | | | | | X | | | | | | | | | | | | | | X | | | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | X | X | | | X | | | | | X | | | | | X | X | X | | | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 8 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 12 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | X | X | X | X | | | X | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | X | | | | | | | 
| | | | | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | X | X | X | | | | X | | | X | | X | | X | | | | | X | | | | | X | X | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | 12 | 7 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | X | X | X | X | | | X | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
DZN111/a
---
license: openrail
---
CyberHarem/unicorn_azurlane
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of unicorn/ユニコーン/独角兽 (Azur Lane)

This is the dataset of unicorn/ユニコーン/独角兽 (Azur Lane), containing 500 images and their tags.

The core tags of this character are `purple_hair, long_hair, bangs, purple_eyes, ahoge, very_long_hair, ribbon, hair_bun, single_hair_bun, hair_ribbon, single_side_bun, one_side_up, black_ribbon, bow, hair_ornament, hair_between_eyes`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             |   Images | Size       | Download                                                                                                           | Type       | Description                                                           |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              |      500 | 703.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unicorn_azurlane/resolve/main/dataset-raw.zip)               | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              |      500 | 380.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unicorn_azurlane/resolve/main/dataset-800.zip)               | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  |     1301 | 863.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unicorn_azurlane/resolve/main/dataset-stage3-p480-800.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             |      500 | 612.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unicorn_azurlane/resolve/main/dataset-1200.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 |     1301 | 1.21 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/unicorn_azurlane/resolve/main/dataset-stage3-p480-1200.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/unicorn_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, blush, detached_sleeves, long_sleeves, looking_at_viewer, object_hug, solo, stuffed_animal, stuffed_winged_unicorn, white_background, white_dress, black_bow, simple_background, closed_mouth, sleeves_past_wrists, halterneck | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, fingerless_gloves, solo, bare_shoulders, cosplay, looking_at_viewer, purple_dress, purple_gloves, black_thighhighs, elbow_gloves, star_(symbol), blue_dress, collarbone, starry_sky, night_sky, quad_tails, zettai_ryouiki, breasts, halterneck, parted_lips, smile | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, bun_cover, china_dress, short_sleeves, solo, white_dress, looking_at_viewer, pelvic_curtain, white_thighhighs, wrist_cuffs, double_bun, stuffed_animal, stuffed_winged_unicorn, white_background, closed_mouth, small_breasts, gift_box, on_head, sitting, simple_background, cleavage_cutout, covered_navel, 
heart, holding_gift | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, hair_scrunchie, holding_phone, long_sleeves, looking_at_viewer, low_twintails, pleated_skirt, serafuku, simple_background, smartphone, solo, white_pantyhose, x_hair_ornament, black_skirt, white_background, white_cardigan, black_sailor_collar, black_scrunchie, neckerchief, open_mouth, :d | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, backpack, black_skirt, blush, holding_phone, long_sleeves, looking_at_viewer, low_twintails, pleated_skirt, serafuku, smartphone, solo, white_cardigan, white_pantyhose, x_hair_ornament, hair_scrunchie, pink_neckerchief, anchor_symbol, simple_background, white_background, bag_charm, black_sailor_collar, sidelocks | | 5 | 11 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, backpack, hair_scrunchie, long_sleeves, looking_at_viewer, low_twintails, pleated_skirt, serafuku, solo, x_hair_ornament, blush, simple_background, white_background, white_pantyhose, black_skirt, white_cardigan, pink_neckerchief, black_scrunchie, black_sailor_collar, anchor_symbol, closed_mouth, collarbone, holding_strap, smile | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, black_sailor_collar, blush, hair_scrunchie, long_sleeves, looking_at_viewer, low_twintails, serafuku, solo, anchor_symbol, closed_mouth, pink_neckerchief, smile, white_cardigan, white_pantyhose, x_hair_ornament, black_scrunchie, black_skirt, pleated_skirt, sleeves_past_wrists | | 7 
| 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, long_sleeves, solo, blush, hair_flower, holding, looking_at_viewer, obi, wide_sleeves, white_kimono, depth_of_field, floral_print, parted_lips, pink_flower, smile, white_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | detached_sleeves | long_sleeves | looking_at_viewer | object_hug | solo | stuffed_animal | stuffed_winged_unicorn | white_background | white_dress | black_bow | simple_background | closed_mouth | sleeves_past_wrists | halterneck | fingerless_gloves | cosplay | purple_dress | purple_gloves | black_thighhighs | elbow_gloves | star_(symbol) | blue_dress | collarbone | starry_sky | night_sky | quad_tails | zettai_ryouiki | breasts | parted_lips | smile | bun_cover | china_dress | short_sleeves | pelvic_curtain | white_thighhighs | wrist_cuffs | double_bun | small_breasts | gift_box | on_head | sitting | cleavage_cutout | covered_navel | heart | holding_gift | hair_scrunchie | holding_phone | low_twintails | pleated_skirt | serafuku | smartphone | white_pantyhose | x_hair_ornament | black_skirt | white_cardigan | black_sailor_collar | black_scrunchie | neckerchief | open_mouth | :d | backpack | pink_neckerchief | anchor_symbol | bag_charm | sidelocks | holding_strap | hair_flower | holding | obi | wide_sleeves | white_kimono | depth_of_field | floral_print | pink_flower | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-------------------|:---------------|:--------------------|:-------------|:-------|:-----------------|:-------------------------|:-------------------|:--------------|:------------|:--------------------|:---------------|:----------------------|:-------------|:--------------------|:----------|:---------------|:----------------|:-------------------|:---------------|:----------------|:-------------|:-------------|:-------------|:------------|:-------------|:-----------------|:----------|:--------------|:--------|:------------|:--------------|:----------------|:-----------------|:-------------------|:--------------|:-------------|:----------------|:-----------|:----------|:----------|:------------------|:----------------|:--------|:---------------|:-----------------|:----------------|:----------------|:----------------|:-----------|:-------------|:------------------|:------------------|:--------------|:-----------------|:----------------------|:------------------|:--------------|:-------------|:-----|:-----------|:-------------------|:----------------|:------------|:------------|:----------------|:--------------|:----------|:------|:---------------|:---------------|:-----------------|:---------------|:--------------| | 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | 
![](samples/1/clu1-sample4.png) | X | X | X | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | X | | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | X | X | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | X | X | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | X | X | X | X | X | | | | | | | | | | | 5 | 11 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | X | X | | X | | | X | | | X | X | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | | X | X | X | | X | X | X | X | X | X | | | | X | X | X | | | X | | | | | | | | | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | X | X | | X | | | | | | | X | X | | | | | | | | | 
| | | | | | | | X | | | | | | | | | | | | | | | | X | | X | X | X | | X | X | X | X | X | X | | | | | X | X | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
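The `IMG+TXT` packages listed above pair each image with a same-named `.txt` file holding its tags. A minimal sketch of matching those pairs after extracting one of the zips — the flat layout and the set of image extensions are assumptions, adjust for the actual archive structure:

```python
import os

def pair_img_txt(filenames):
    """Pair image files with same-named .txt tag files from a flat file listing."""
    names = set(filenames)
    pairs = []
    for fn in sorted(names):
        stem, ext = os.path.splitext(fn)
        if ext.lower() in {".png", ".jpg", ".jpeg", ".webp"} and stem + ".txt" in names:
            pairs.append((fn, stem + ".txt"))
    return pairs

# Example with a hypothetical extracted listing:
files = ["0001.png", "0001.txt", "0002.jpg", "0002.txt", "readme.md"]
print(pair_img_txt(files))  # -> [('0001.png', '0001.txt'), ('0002.jpg', '0002.txt')]
```

In practice you would build `filenames` with `os.listdir` on the extraction directory and read each `.txt` file for the tag string.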
Icannos/chess_studies
--- license: cc0-1.0 task_categories: - text-generation language: - en pretty_name: Annotated chess games and studies size_categories: - 1K<n<10K --- # Dataset Card for Dataset Name ## Dataset Description - **Point of Contact:** maxime.darrin@outlook.com ### Dataset Summary This dataset consists of chess games and chess studies annotated by humans. It has two subsets: the first, "lichess", consists of the top Lichess studies scraped from lichess.org. The "others" subset mainly consists of games from https://www.angelfire.com/games3/smartbridge/ ### Supported Tasks and Leaderboards It is intended for training generative text models on chess. ### Languages The main language represented is English, although other languages may be present in insignificant amounts. ## Dataset Structure ### How to use: ```python from datasets import load_dataset import chess.pgn import io dataset = load_dataset("Icannos/chess_studies", "lichess", streaming=True) for d in dataset['train']: pgn = io.StringIO(d['text']) game = chess.pgn.read_game(pgn) print(game) break ``` ### Data Instances Example of an annotated game / study from lichess. The annotations include arrows and circles drawn on the board, in addition to natural language commentary and sometimes computer evaluations. ``` [Event "🇷🇺 Petrov Defense 🇷🇺: Nimzowitsch Attack"] [Site "https://lichess.org/study/OnPMlzHT/oG7xbZFE"] [Date "????.??.??"] [Round "?"] [White "?"] [Black "?"] [Result "*"] [Annotator "https://lichess.org/@/LeninPerez"] [ECO "C42"] [Opening "Russian Game: Nimzowitsch Attack"] [UTCDate "2021.02.11"] [UTCTime "00:54:33"] [Variant "Standard"] 1. e4 { Do you remember the movements from the previous chapter? I hope so, because you should do them now :D } 1... e5 { That's! } 2. Nf3 { And now? } 2... Nf6 { Great job! } 3. Nxe5 { You will find this frequently in your games with this defense. That is, the most common in move 3 is that the white player takes the pawn. How can you drive the white knight of e5? 
} 3... d6 { Very well! [%csl Re5][%cal Rd6e5] } 4. Nf3 { You know what you have to do now, right? } 4... Nxe4 { Excellent, you get the pawn back! The blue arrows represent all the options the white player has to play now. [%cal Bd2d3,Bd3d4,Bb1c3,Bd1e2] } 5. Nc3 { This is the Nimzowitsch Attack! Change the knights [%cal Re4c3,Rc3e4] } 5... Nxc3 6. dxc3 { The white player must deal with the doubled pawns on the c column Develop your bishop [%csl Gf8] } 6... Be7 7. Be3 { What would you play now? (Psst, your king is in the center) } 7... O-O 8. Qd2 { White wants the queenside castling Now you must take your knight to f3, what is the shortest route? [%csl Gc1,Gf6,Gb8][%cal Ge1c1] } 8... Nd7 { That's! } 9. O-O-O { This is really the Nimzowitsch Attack. White castles long to plan a battle of attacks on opposite flanks! Where should this knight go? [%csl Gd7] } 9... Nf6 10. Bd3 { Play 10.c5 [%csl Gc5][%cal Gc7c5] } 10... c5 { Very well! Now the white player wants to attack your king with the pawns on the queenside. You must play as I indicate with the arrows, that is, attack the weak point a2 and improve your towers. [%csl Ra2][%cal Ba8c8,Bf8e8,Yc8e6,Yd8a5] } * ``` ### Data Fields The only field is "text". Each row contains exactly one game/PGN file in the text field. ### Data Splits A single train split. ## Dataset Creation ### Source Data #### Lichess studies The Lichess studies consist of the first 10 pages of studies (ranked by stars) on Lichess (https://lichess.org/study/all/popular). #### Other studies I relied mainly on the compilation built over the years at https://www.angelfire.com/games3/smartbridge/, which consists of games by top players. ## Other Known Limitations The annotations are mainly in English (although some are annotated in French). ## Citation information TO COME.
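The `[%csl ...]` and `[%cal ...]` tags in the example comments above encode circled squares and drawn arrows. A small regex sketch for pulling them out of a comment string — this is an illustrative helper, not an official python-chess or pgn-extract API:

```python
import re

def parse_annotations(comment):
    """Extract circled squares (%csl) and arrows (%cal) from a PGN comment string."""
    circles, arrows = [], []
    for kind, body in re.findall(r"\[%(csl|cal) ([^\]]+)\]", comment):
        for item in body.split(","):
            item = item.strip()
            color, squares = item[0], item[1:]  # e.g. 'Rd6e5' -> color 'R', squares 'd6e5'
            if kind == "csl":
                circles.append((color, squares))                  # a single circled square
            else:
                arrows.append((color, squares[:2], squares[2:]))  # (color, from, to)
    return circles, arrows

comment = "Very well! [%csl Re5][%cal Rd6e5,Bd1e2]"
print(parse_annotations(comment))
# -> ([('R', 'e5')], [('R', 'd6', 'e5'), ('B', 'd1', 'e2')])
```

Combined with `chess.pgn.read_game` from the usage snippet above, this lets a model's training pipeline keep or strip the board-drawing annotations independently of the natural-language commentary.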
dipudl/hc3-and-gpt-wiki-intro-with-perplexity
--- dataset_info: features: - name: prompt dtype: string - name: text dtype: string - name: source dtype: string - name: label dtype: int64 - name: perplexity dtype: float64 splits: - name: train num_bytes: 396594042.354058 num_examples: 330344 - name: test num_bytes: 20925699.0 num_examples: 17387 download_size: 251965361 dataset_size: 417519741.354058 --- # Dataset Card for "hc3-and-gpt-wiki-intro-with-perplexity" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
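The `perplexity` column can drive a simple threshold detector for machine-generated text. A self-contained sketch of picking the threshold that maximizes accuracy on labeled pairs; the direction of the inequality (lower perplexity suggests model output) and the label convention are illustrative assumptions, not documented by this card:

```python
def best_threshold(samples):
    """samples: list of (perplexity, label) pairs, label 1 = machine-generated.
    Predicts label 1 when perplexity <= threshold; returns (threshold, accuracy)."""
    best_t, best_acc = None, 0.0
    for t, _ in samples:  # candidate thresholds are the observed perplexities
        correct = sum((ppl <= t) == bool(label) for ppl, label in samples)
        acc = correct / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

data = [(12.0, 1), (15.5, 1), (48.0, 0), (61.2, 0)]
print(best_threshold(data))  # -> (15.5, 1.0)
```

On the real train split you would feed in `zip(ds["perplexity"], ds["label"])` and evaluate the chosen threshold on the test split.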
im-Kitsch/minari_d4rl
--- license: apache-2.0 task_categories: - reinforcement-learning --- # Transfer from D4RL datasets to Minari datasets The transfer and validation scripts are in `transfer.py`. 1. Clone the repo: ``` $ git clone https://huggingface.co/datasets/im-Kitsch/minari_d4rl ``` 2. Copy the files to the Minari root (default is ~/.minari): ``` mv minari_d4rl/datasets ~/.minari/datasets ``` # TODO Info fields such as `infos/qvel` are not saved yet, since their interface is not stable and they cannot be read directly.
dmayhem93/self-critiquing-base-selected-900
--- dataset_info: features: - name: id dtype: string - name: split dtype: string - name: time dtype: float64 - name: labeler dtype: string - name: is_topic_based_summarization dtype: bool - name: prompt dtype: string - name: responses sequence: string - name: embedding sequence: float64 splits: - name: train num_bytes: 18994180 num_examples: 900 download_size: 13075326 dataset_size: 18994180 --- # Dataset Card for "self-critiquing-base-selected-900" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
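Each row carries an `embedding` vector alongside its `responses`; a plausible use is similarity-based selection or deduplication over prompts (the actual selection criterion behind "selected-900" is not documented here, so this is an assumption). A dependency-free cosine-similarity sketch:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine([1.0, 0.0], [1.0, 0.0]))  # -> 1.0
```

Against the real data you would compare `row_i["embedding"]` with `row_j["embedding"]` and drop near-duplicate prompts above some similarity cutoff.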
Nexdata/British_English_Speech_Data_by_Mobile_Phone
--- YAML tags: - copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging --- # Dataset Card for Nexdata/British_English_Speech_Data_by_Mobile_Phone ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://www.nexdata.ai/datasets/950?source=Huggingface - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary 831 hours of mobile telephony British English speech data, recorded by 1,651 native British speakers. The recordings cover many categories, such as generic, interactive, in-car and smart-home speech. The texts are manually proofread to ensure a high accuracy rate. The database supports both Android and iOS. 
For more details, please refer to the link: https://www.nexdata.ai/datasets/950?source=Huggingface ### Supported Tasks and Leaderboards automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR). ### Languages British English ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing ### Citation Information [More Information Needed] ### Contributions
roleplay4fun/chai-invalid-pairs-ss5-v1.0
--- dataset_info: features: - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string splits: - name: train num_bytes: 24774445 num_examples: 14516 - name: test num_bytes: 2767156 num_examples: 1615 download_size: 14826645 dataset_size: 27541601 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
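Given the `prompt`/`chosen`/`rejected` schema above, each row maps directly onto the triplet format expected by common preference-optimization (DPO-style) trainers. A minimal conversion sketch; the whitespace handling is an illustrative assumption:

```python
def to_preference_triplet(row):
    """Map a dataset row onto the (prompt, chosen, rejected) triplet used by DPO-style trainers."""
    return {
        "prompt": row["prompt"].strip(),
        "chosen": row["chosen"].strip(),
        "rejected": row["rejected"].strip(),
    }

row = {"prompt": "User: hello\n", "chosen": "Hi there!", "rejected": "Go away."}
print(to_preference_triplet(row))
```

With `datasets`, applying this via `.map(to_preference_triplet)` over the `train` and `test` splits yields inputs ready for a preference-tuning loop.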
48xrf/rares
--- license: wtfpl ---
distilled-one-sec-cv12-each-chunk-uniq/chunk_37
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 959335024.0 num_examples: 186932 download_size: 979658799 dataset_size: 959335024.0 --- # Dataset Card for "chunk_37" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ai2lumos/lumos_unified_plan_iterative
--- license: apache-2.0 task_categories: - text-generation - question-answering language: - en tags: - language-agent - maths - reasoning - question-answering - web-agent - planning size_categories: - 10K<n<100K --- # 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents <p align="center"> 🌐<a href="https://allenai.github.io/lumos">[Website]</a> &nbsp; 📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a> &nbsp; 🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a> &nbsp; 🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a> &nbsp; 🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a> &nbsp; </p> We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents. **Lumos** has the following features: * 🧩 **Modular Architecture**: - 🧩 **Lumos** consists of planning, grounding, and execution modules built on LLAMA-2-7B/13B and off-the-shelf APIs. - 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks. * 🌍 **Diverse Training Data**: - 🌍 **Lumos** is trained with ~56K diverse high-quality subgoal/action annotations derived with GPT-4 from ground-truth reasoning steps in existing benchmarks. - ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks. * 🚀 **Competitive Performance**: - 🚀 **Lumos** is comparable to, or even beats, **GPT-series** agents on the web/complex QA tasks Mind2Web and HotpotQA, and **larger open agents** on math and multimodal tasks. 
- 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web and ScienceQA annotations, such as **FiReAct**, **AgentLM**, and **AutoAct**. - 🚀 **Lumos** performs better than open agent baseline formulations including **chain-of-thoughts** and **integrated** training. - 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on unseen tasks, WebShop and InterCode_SQL. ## Data Overview `lumos_unified_plan_iterative` is the data for training the **planning** module on **maths**, **complex QA** and **web agent** tasks in the **Lumos-Iterative (Lumos-I)** formulation. The sources of the training annotations are shown below: | Task | Number | |---|---| |PRM800K|10000| |GSM8K|7473| |ASDiv|2305| |StrategyQA|1777| |Musique|17632| |Mind2Web|1009| |A-OKVQA|15941| ## Models Trained with the Data `lumos_unified_plan_iterative` is used to train the following models. |Model|Huggingface Repo| |---|---| |`lumos_unified_plan_iterative`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_plan_iterative) | |`lumos_unified_plan_iterative-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_plan_iterative-13B) | ## Citation If you find this work relevant to your research, please feel free to cite our work! ``` @article{yin2023lumos, title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents}, author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen}, journal={arXiv preprint arXiv:2311.05657}, year={2023} } ```
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.1_SFT
--- pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_Cyber_3.1_SFT dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [LeroyDyer/Mixtral_AI_Cyber_3.1_SFT](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.1_SFT)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.1_SFT\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-29T20:40:02.578448](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.1_SFT/blob/main/results_2024-03-29T20-40-02.578448.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6458266984382132,\n\ \ \"acc_stderr\": 0.03228792089743707,\n \"acc_norm\": 0.6475023903958916,\n\ \ \"acc_norm_stderr\": 0.03294593467581811,\n \"mc1\": 0.36107711138310894,\n\ \ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5274741718707786,\n\ \ \"mc2_stderr\": 0.015124094961851015\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225405,\n\ \ \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685253\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6206930890260904,\n\ \ \"acc_stderr\": 0.004842229276915337,\n \"acc_norm\": 0.8131846245767775,\n\ \ \"acc_norm_stderr\": 0.0038896668378694504\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\ \ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\ \ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\ \ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\ \ \"acc_norm_stderr\": 
0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\ \ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\ \ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\ \ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\ \ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\ \ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\ \ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\ \ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\ \ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924006,\n \"\ acc_norm\": 
0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924006\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"\ acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\ acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\ : 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218964,\n \"\ acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218964\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\ \ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\ \ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \ \ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \ \ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\ : 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n\ \ \"acc_stderr\": 0.01599015488507339,\n \"acc_norm\": 0.8330275229357799,\n\ \ \"acc_norm_stderr\": 0.01599015488507339\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n\ \ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\ acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \ \ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\ \ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\ \ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\ \ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\ \ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\ \ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\ \ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\ \ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\ \ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\ \ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47039106145251397,\n\ \ \"acc_stderr\": 0.01669315492738356,\n \"acc_norm\": 0.47039106145251397,\n\ \ 
\"acc_norm_stderr\": 0.01669315492738356\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\ \ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\ \ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\ \ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\ \ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \ \ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n\ \ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n\ \ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\ \ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \ \ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\ \ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\ \ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \ \ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\ \ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5274741718707786,\n\ \ \"mc2_stderr\": 0.015124094961851015\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487045\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6141015921152388,\n \ \ \"acc_stderr\": 0.013409077471319175\n }\n}\n```" repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.1_SFT leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|arc:challenge|25_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-29T20-40-02.578448.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|gsm8k|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_29T20_40_02.578448 
path: - '**/details_harness|hellaswag|10_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-40-02.578448.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-40-02.578448.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-40-02.578448.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-40-02.578448.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-40-02.578448.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-40-02.578448.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-40-02.578448.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-management|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-40-02.578448.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|truthfulqa:mc|0_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-29T20-40-02.578448.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_29T20_40_02.578448 path: - '**/details_harness|winogrande|5_2024-03-29T20-40-02.578448.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-29T20-40-02.578448.parquet' - config_name: results data_files: - split: 
2024_03_29T20_40_02.578448 path: - results_2024-03-29T20-40-02.578448.parquet - split: latest path: - results_2024-03-29T20-40-02.578448.parquet --- # Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_Cyber_3.1_SFT <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_Cyber_3.1_SFT](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.1_SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.1_SFT", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-29T20:40:02.578448](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.1_SFT/blob/main/results_2024-03-29T20-40-02.578448.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6458266984382132, "acc_stderr": 0.03228792089743707, "acc_norm": 0.6475023903958916, "acc_norm_stderr": 0.03294593467581811, "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836886, "mc2": 0.5274741718707786, "mc2_stderr": 0.015124094961851015 }, "harness|arc:challenge|25": { "acc": 0.5878839590443686, "acc_stderr": 0.014383915302225405, "acc_norm": 0.6186006825938567, "acc_norm_stderr": 0.014194389086685253 }, "harness|hellaswag|10": { "acc": 0.6206930890260904, "acc_stderr": 0.004842229276915337, "acc_norm": 0.8131846245767775, "acc_norm_stderr": 0.0038896668378694504 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.02794321998933713, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.02794321998933713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, 
"acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.46078431372549017, "acc_stderr": 0.04959859966384181, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.04959859966384181 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.03265019475033582, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42063492063492064, "acc_stderr": 0.025424835086924006, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.025424835086924006 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, "harness|hendrycksTest-high_school_chemistry|5": { 
"acc": 0.47783251231527096, "acc_stderr": 0.03514528562175008, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218964, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218964 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.024233532297758733, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.024233532297758733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.02889774874113114, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.02889774874113114 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.423841059602649, "acc_stderr": 0.04034846678603397, "acc_norm": 0.423841059602649, "acc_norm_stderr": 0.04034846678603397 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8330275229357799, "acc_stderr": 0.01599015488507339, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507339 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, "acc_norm": 0.5185185185185185, 
"acc_norm_stderr": 0.03407632093854051 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.025530100460233504, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.025530100460233504 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699813, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699813 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.034624199316156234, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.034624199316156234 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597528, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597528 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 
0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8084291187739464, "acc_stderr": 0.014072859310451949, "acc_norm": 0.8084291187739464, "acc_norm_stderr": 0.014072859310451949 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7138728323699421, "acc_stderr": 0.02433214677913413, "acc_norm": 0.7138728323699421, "acc_norm_stderr": 0.02433214677913413 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.47039106145251397, "acc_stderr": 0.01669315492738356, "acc_norm": 0.47039106145251397, "acc_norm_stderr": 0.01669315492738356 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.025058503316958147, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.025058503316958147 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7253086419753086, "acc_stderr": 0.024836057868294677, "acc_norm": 0.7253086419753086, "acc_norm_stderr": 0.024836057868294677 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.475177304964539, "acc_stderr": 0.02979071924382972, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.02979071924382972 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44002607561929596, "acc_stderr": 0.012678037478574513, "acc_norm": 0.44002607561929596, "acc_norm_stderr": 0.012678037478574513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223977, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223977 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, "acc_norm": 0.6363636363636364, 
"acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128438, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128438 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.025196929874827072, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.025196929874827072 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.032659863237109066, "acc_norm": 0.88, "acc_norm_stderr": 0.032659863237109066 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.028782108105401705, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.028782108105401705 }, "harness|truthfulqa:mc|0": { "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836886, "mc2": 0.5274741718707786, "mc2_stderr": 0.015124094961851015 }, "harness|winogrande|5": { "acc": 0.8018942383583267, "acc_stderr": 0.011201862744487045 }, "harness|gsm8k|5": { "acc": 0.6141015921152388, "acc_stderr": 0.013409077471319175 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
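The per-task entries in the "Latest results" section above all share the same layout (`acc`, `acc_stderr`, `acc_norm`, ...), so once the results JSON is loaded, comparing tasks is plain dictionary work. A minimal sketch, using a handful of accuracy values copied from the results above (the full JSON linked in that section has the same shape):

```python
# A few entries copied from the "Latest results" JSON above; the real
# file contains one such entry per evaluated task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.78},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8589743589743589},
    "harness|winogrande|5": {"acc": 0.8018942383583267},
}

# Rank task names by accuracy, best first.
ranked = sorted(results, key=lambda task: results[task]["acc"], reverse=True)
print(ranked[0])  # harness|hendrycksTest-marketing|5
```

The same pattern applies to the full `results_*.json` file after a `json.load`.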
bigscience-data/roots_ca_enriched_conllu_ancora_for_ml_training
--- language: ca license: cc-by-4.0 extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience Ethical Charter. The charter can be found at: https://hf.co/spaces/bigscience/ethical-charter' extra_gated_fields: I have read and agree to abide by the BigScience Ethical Charter: checkbox --- ROOTS Subset: roots_ca_enriched_conllu_ancora_for_ml_training # Enriched CONLLU Ancora for ML training - Dataset uid: `enriched_conllu_ancora_for_ml_training` ### Description This is an enriched version, for Machine Learning purposes, of the CONLLU adaptation of the AnCora corpus. This version of the corpus was developed by BSC TeMU as part of the AINA project, and has been used for multi-task learning for the Catalan spaCy 3.0 models. ### Homepage https://zenodo.org/record/5036651 ### Licensing - cc-by-4.0: Creative Commons Attribution 4.0 International ### Speaker Locations - Spain ### Sizes - 0.0000 % of total - 0.0000 % of ca ### BigScience processing steps #### Filters applied to: ca - dedup_document - dedup_template_soft - filter_remove_empty_docs - filter_small_docs_bytes_1024
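The filter names in the list above are fairly self-describing. As a rough illustration of what a step like `filter_small_docs_bytes_1024` might amount to (a guess at the semantics, not the actual BigScience implementation), one could drop documents whose UTF-8 text is under 1024 bytes:

```python
def filter_small_docs_bytes(docs, min_bytes=1024):
    """Keep only documents whose UTF-8 encoded text is at least min_bytes long."""
    return [d for d in docs if len(d["text"].encode("utf-8")) >= min_bytes]

docs = [
    {"text": "massa curt"},  # too short, gets dropped
    {"text": "a" * 2048},    # long enough, kept
]
kept = filter_small_docs_bytes(docs)
print(len(kept))  # 1
```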
DynamicSuperb/SpeechTextMatching_LJSpeech
--- dataset_info: features: - name: file dtype: string - name: audio dtype: audio - name: text dtype: string - name: instruction dtype: string - name: label dtype: string - name: transcription dtype: string splits: - name: test num_bytes: 58054642.03053435 num_examples: 200 download_size: 57046234 dataset_size: 58054642.03053435 configs: - config_name: default data_files: - split: test path: data/test-* --- # Dataset Card for "speechTextMatching_LJSpeech" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liyucheng/sharegpt-500
--- dataset_info: features: - name: id dtype: string - name: chat sequence: sequence: string splits: - name: train num_bytes: 2185076 num_examples: 575 download_size: 1065085 dataset_size: 2185076 --- # Dataset Card for "sharegpt-500" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
averageandyyy/combined_librispeech_self
--- dataset_info: features: - name: audio dtype: audio - name: transcript dtype: string splits: - name: train num_bytes: 6992473149.621 num_examples: 28539 - name: test num_bytes: 342449773.04 num_examples: 2620 download_size: 6782935012 dataset_size: 7334922922.661 --- # Dataset Card for "combined_librispeech_self" - train: 28539 examples - test: 2620 examples
Multimodal-Fatima/FGVC_Aircraft_test_facebook_opt_350m_Attributes_Caption_ns_3333_random
--- dataset_info: features: - name: id dtype: int64 - name: image dtype: image - name: prompt dtype: string - name: true_label dtype: string - name: prediction dtype: string - name: scores sequence: float64 splits: - name: fewshot_1_bs_16 num_bytes: 300147629.375 num_examples: 3333 - name: fewshot_3_bs_16 num_bytes: 301866570.375 num_examples: 3333 download_size: 595044400 dataset_size: 602014199.75 --- # Dataset Card for "FGVC_Aircraft_test_facebook_opt_350m_Attributes_Caption_ns_3333_random" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
shuyuej/mathdata_consistency
--- license: apache-2.0 ---
liuyanchen1015/MULTI_VALUE_sst2_our_us
--- dataset_info: features: - name: sentence dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev num_bytes: 1041 num_examples: 7 - name: test num_bytes: 2903 num_examples: 15 - name: train num_bytes: 35765 num_examples: 304 download_size: 21109 dataset_size: 39709 --- # Dataset Card for "MULTI_VALUE_sst2_our_us" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
alzoubi36/privaseer
--- license: gpl-3.0 dataset_info: features: - name: hash dtype: string - name: url dtype: string - name: text dtype: string - name: title dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 17080868768 num_examples: 2180300 download_size: 8133175578 dataset_size: 17080868768 --- ## Privaseer Dataset Huggingface version of the [Privaseer](https://privaseer.ist.psu.edu/) dataset. <pre> @inproceedings{srinath-etal-2021-privacy, title = "Privacy at Scale: Introducing the {P}riva{S}eer Corpus of Web Privacy Policies", author = "Srinath, Mukund and Wilson, Shomir and Giles, C Lee", booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)", month = aug, year = "2021", address = "Online", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.acl-long.532", doi = "10.18653/v1/2021.acl-long.532", pages = "6829--6839", abstract = "Organisations disclose their privacy practices by posting privacy policies on their websites. Even though internet users often care about their digital privacy, they usually do not read privacy policies, since understanding them requires a significant investment of time and effort. Natural language processing has been used to create experimental tools to interpret privacy policies, but there has been a lack of large privacy policy corpora to facilitate the creation of large-scale semi-supervised and unsupervised models to interpret and simplify privacy policies. Thus, we present the PrivaSeer Corpus of 1,005,380 English language website privacy policies collected from the web. The number of unique websites represented in PrivaSeer is about ten times larger than the next largest public collection of web privacy policies, and it surpasses the aggregate of unique websites represented in all other publicly available privacy policy corpora combined. 
We describe a corpus creation pipeline with stages that include a web crawler, language detection, document classification, duplicate and near-duplicate removal, and content extraction. We employ an unsupervised topic modelling approach to investigate the contents of policy documents in the corpus and discuss the distribution of topics in privacy policies at web scale. We further investigate the relationship between privacy policy domain PageRanks and text features of the privacy policies. Finally, we use the corpus to pretrain PrivBERT, a transformer-based privacy policy language model, and obtain state of the art results on the data practice classification and question answering tasks.",} </pre>
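The pipeline described in the abstract includes duplicate and near-duplicate removal. As a simplified sketch of the exact-duplicate part only (the paper's pipeline also handles near-duplicates, which needs fuzzier techniques such as shingling or MinHash), documents can be deduplicated by hashing their text:

```python
import hashlib

def drop_exact_duplicates(docs):
    """Keep the first occurrence of each distinct policy text; drop later copies."""
    seen, unique = set(), []
    for doc in docs:
        digest = hashlib.sha256(doc["text"].encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = [
    {"id": 1, "text": "We value your privacy."},
    {"id": 2, "text": "We value your privacy."},  # exact duplicate of id 1
    {"id": 3, "text": "Cookies may be used on this site."},
]
print([d["id"] for d in drop_exact_duplicates(docs)])  # [1, 3]
```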
saurabhkarn/embedding
--- license: other ---
BigTMiami/amazon_25M_50_000_condensed
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 55191036 num_examples: 8277 - name: validation num_bytes: 5701140 num_examples: 855 download_size: 19447795 dataset_size: 60892176 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* ---
bhatvineet/mr_trial3
--- dataset_info: features: - name: audio dtype: audio - name: transcriptions sequence: string splits: - name: train num_bytes: 1009711800.042 num_examples: 4179 - name: test num_bytes: 359681461.83 num_examples: 1393 download_size: 1379902601 dataset_size: 1369393261.872 --- # Dataset Card for "mr_trial3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HydraLM/biology_dataset_list_dict
--- dataset_info: features: - name: conversations list: - name: input dtype: string - name: response dtype: string - name: conversation_id dtype: int64 splits: - name: train num_bytes: 59521695 num_examples: 19999 download_size: 28653719 dataset_size: 59521695 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "biology_dataset_list_dict" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
knowrohit07/know_sql
--- license: openrail --- please use the val ign file for training, it's much cleaner. thanks :)
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-82000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 919300 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
BramVanroy/wikipedia_culturax_dutch
--- language: - nl size_categories: - 10B<n<100B task_categories: - text-generation - text2text-generation pretty_name: Filtered CulturaX + Wikipedia for Dutch dataset_info: - config_name: 100M features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 738455828.5851797 num_examples: 1018200 - name: test num_bytes: 7458534.414820259 num_examples: 10284 download_size: 411183119 dataset_size: 745914363.0 - config_name: 100k features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 745955.3074739829 num_examples: 1047 - name: test num_bytes: 7124.692526017029 num_examples: 10 download_size: 366788 dataset_size: 753080.0 - config_name: 10B features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 66539945646.34457 num_examples: 40176566 - name: test num_bytes: 105996030.65543362 num_examples: 64000 download_size: 42132184504 dataset_size: 66645941677.0 - config_name: 10M features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 76734151.72157606 num_examples: 139851 - name: test num_bytes: 774743.2784239326 num_examples: 1412 download_size: 37995388 dataset_size: 77508895.0 - config_name: 10k features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 72048.30379746835 num_examples: 78 - name: test num_bytes: 5896 num_examples: 1 download_size: 47197 dataset_size: 77944.30379746835 - config_name: 15B features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 99730049355.25276 num_examples: 59584123 - name: test num_bytes: 107121206.74724333 num_examples: 64000 download_size: 63139415312 dataset_size: 99837170562.0 - config_name: 1B features: - name: text dtype: string - name: 
url dtype: string - name: source dtype: string splits: - name: train num_bytes: 6797502496.392602 num_examples: 5102360 - name: test num_bytes: 68660322.60739774 num_examples: 51538 download_size: 4260450464 dataset_size: 6866162819.0 - config_name: 1M features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 7442665.619329753 num_examples: 10694 - name: test num_bytes: 75164.38067024625 num_examples: 108 download_size: 3845466 dataset_size: 7517830.0 - config_name: 20B features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 132920704365.75093 num_examples: 78991679 - name: test num_bytes: 107693939.24907027 num_examples: 64000 download_size: 84141456153 dataset_size: 133028398305.0 - config_name: 25B features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 166111586295.01904 num_examples: 98399236 - name: test num_bytes: 108040894.98094498 num_examples: 64000 download_size: 105147418131 dataset_size: 166219627190.0 - config_name: 30B features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 199302582477.5805 num_examples: 117806793 - name: test num_bytes: 108273597.41950662 num_examples: 64000 download_size: 126152714564 dataset_size: 199410856075.0 - config_name: 5B features: - name: text dtype: string - name: url dtype: string - name: source dtype: string splits: - name: train num_bytes: 33351938314.309906 num_examples: 20769009 - name: test num_bytes: 102774477.69009268 num_examples: 64000 download_size: 21119808690 dataset_size: 33454712792.0 configs: - config_name: 100M data_files: - split: train path: 100M/train-* - split: test path: 100M/test-* - config_name: 100k data_files: - split: train path: 100k/train-* - split: test path: 100k/test-* - config_name: 10B data_files: - split: train path: 
10B/train-* - split: test path: 10B/test-* - config_name: 10M data_files: - split: train path: 10M/train-* - split: test path: 10M/test-* - config_name: 10k data_files: - split: train path: 10k/train-* - split: test path: 10k/test-* - config_name: 15B data_files: - split: train path: 15B/train-* - split: test path: 15B/test-* - config_name: 1B data_files: - split: train path: 1B/train-* - split: test path: 1B/test-* - config_name: 1M data_files: - split: train path: 1M/train-* - split: test path: 1M/test-* - config_name: 20B data_files: - split: train path: 20B/train-* - split: test path: 20B/test-* - config_name: 25B data_files: - split: train path: 25B/train-* - split: test path: 25B/test-* - config_name: 30B data_files: - split: train path: 30B/train-* - split: test path: 30B/test-* - config_name: 5B data_files: - split: train path: 5B/train-* - split: test path: 5B/test-* --- # Filtered CulturaX + Wikipedia for Dutch This is a combined and filtered version of [CulturaX](https://huggingface.co/datasets/uonlp/CulturaX) and [Wikipedia](https://huggingface.co/datasets/wikimedia/wikipedia), only including Dutch. It is intended for the training of LLMs. Different configs are available based on the number of tokens (see the overview section below). This can be useful if you want to know exactly how many tokens you have. Great for using as a streaming dataset, too. Tokens are counted as white-space tokens, so depending on your tokenizer, you'll likely end up with more tokens than indicated here. Every config also has a test set (for validation) of 1% of the total size of the dataset, with a minimum of 1 and a maximum of 64k samples (~16M tokens). Wikipedia and CulturaX were shuffled before merging, and the test set creation was also shuffled. Priority is given to Wikipedia to prioritize knowledge content, so the smaller configs consist exclusively of Wikipedia, and for the larger configs we augment with CulturaX.
Every config builds on the previous one, so each config contains all the data of the smaller ones plus more. However, the train/test splits differ between configs, so the test set of one config may overlap with the training set of another. This is usually not a problem, but make sure that you do not train on one config's training set and evaluate on another config's test set. ## Configs ### `10k` -- 79 samples -- 10,087 tokens - ratio_wikipedia: 100.00% - total_num_tokens: 10,087 - train_num_tokens: 9,205 - test_num_tokens: 882 - total_num_samples: 79 - train_num_samples: 78 - test_num_samples: 1 ### `100k` -- 1,057 samples -- 100,075 tokens - ratio_wikipedia: 100.00% - total_num_tokens: 100,075 - train_num_tokens: 98,044 - test_num_tokens: 2,031 - total_num_samples: 1,057 - train_num_samples: 1,047 - test_num_samples: 10 ### `1M` -- 10,802 samples -- 1,000,239 tokens - ratio_wikipedia: 100.00% - total_num_tokens: 1,000,239 - train_num_tokens: 991,119 - test_num_tokens: 9,120 - total_num_samples: 10,802 - train_num_samples: 10,694 - test_num_samples: 108 ### `10M` -- 141,263 samples -- 10,000,022 tokens - ratio_wikipedia: 100.00% - total_num_tokens: 10,000,022 - train_num_tokens: 9,874,772 - test_num_tokens: 125,250 - total_num_samples: 141,263 - train_num_samples: 139,851 - test_num_samples: 1,412 ### `100M` -- 1,028,484 samples -- 100,000,047 tokens - ratio_wikipedia: 100.00% - total_num_tokens: 100,000,047 - train_num_tokens: 99,013,372 - test_num_tokens: 986,675 - total_num_samples: 1,028,484 - train_num_samples: 1,018,200 - test_num_samples: 10,284 ### `1B` -- 5,153,898 samples -- 1,000,000,187 tokens - ratio_wikipedia: 61.21% - total_num_tokens: 1,000,000,187 - train_num_tokens: 989,990,190 - test_num_tokens: 10,009,997 - total_num_samples: 5,153,898 - train_num_samples: 5,102,360 - test_num_samples: 51,538 ### `5B` -- 20,833,009 samples -- 5,000,000,076 tokens - ratio_wikipedia: 25.35% - total_num_tokens: 5,000,000,076 - train_num_tokens:
4,984,493,654 - test_num_tokens: 15,506,422 - total_num_samples: 20,833,009 - train_num_samples: 20,769,009 - test_num_samples: 64,000 ### `10B` -- 40,240,566 samples -- 10,000,000,115 tokens - ratio_wikipedia: 18.41% - total_num_tokens: 10,000,000,115 - train_num_tokens: 9,984,156,828 - test_num_tokens: 15,843,287 - total_num_samples: 40,240,566 - train_num_samples: 40,176,566 - test_num_samples: 64,000 ### `15B` -- 59,648,123 samples -- 15,000,000,154 tokens - ratio_wikipedia: 15.98% - total_num_tokens: 15,000,000,154 - train_num_tokens: 14,983,970,518 - test_num_tokens: 16,029,636 - total_num_samples: 59,648,123 - train_num_samples: 59,584,123 - test_num_samples: 64,000 ### `20B` -- 79,055,679 samples -- 20,000,000,009 tokens - ratio_wikipedia: 14.75% - total_num_tokens: 20,000,000,009 - train_num_tokens: 19,983,799,357 - test_num_tokens: 16,200,652 - total_num_samples: 79,055,679 - train_num_samples: 78,991,679 - test_num_samples: 64,000 ### `25B` -- 98,463,236 samples -- 25,000,000,048 tokens - ratio_wikipedia: 14.00% - total_num_tokens: 25,000,000,048 - train_num_tokens: 24,983,765,326 - test_num_tokens: 16,234,722 - total_num_samples: 98,463,236 - train_num_samples: 98,399,236 - test_num_samples: 64,000 ### `30B` -- 117,870,793 samples -- 30,000,000,087 tokens - ratio_wikipedia: 13.50% - total_num_tokens: 30,000,000,087 - train_num_tokens: 29,983,707,932 - test_num_tokens: 16,292,155 - total_num_samples: 117,870,793 - train_num_samples: 117,806,793 - test_num_samples: 64,000 ## Filtering While CulturaX has already been filtered extensively, additional filtering can improve the quality of the corpus. These filters are described below. The baseline ratios (punctuation, uppercase, digits) were calculated on the SONAR-500 corpus (excluding WRPEA, WRPED, WRUEA, WRUED and WRUEB).
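The document-level ratio filters applied here can be sketched roughly as follows. This is a hypothetical reimplementation with my own function name, not the actual preprocessing code; it assumes ASCII punctuation and uses the thresholds listed for CulturaX + Wikipedia:

```python
import string

PUNCT = set(string.punctuation)  # assumption: ASCII punctuation only


def keep_document(text: str) -> bool:
    """Apply the document-level ratio filters; return True to keep."""
    non_ws = [c for c in text if not c.isspace()]
    if not non_ws:
        return False
    n = len(non_ws)
    punct_ratio = sum(c in PUNCT for c in non_ws) / n
    upper_ratio = sum(c.isupper() for c in non_ws) / n
    digit_ratio = sum(c.isdigit() for c in non_ws) / n
    tokens = text.split()  # non-empty whenever non_ws is non-empty
    avg_token_len = sum(map(len, tokens)) / len(tokens)
    return (punct_ratio <= 0.2        # drop if punctuation ratio > 0.2
            and upper_ratio <= 0.22   # drop if uppercase ratio > 0.22
            and digit_ratio <= 0.16   # drop if digit ratio > 0.16
            and 2 <= avg_token_len <= 20)  # drop if avg token length < 2 or > 20
```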
**CulturaX**: - removed documents that contain the text "rechten voorbehouden" or "rights reserved" - removed documents whose URL contained "wikipedia.org" (because we include a cleaned version of Wikipedia ourselves) - removed documents that contain a "bad word" (see the section below) - removed documents that contain any non-Latin characters. The idea is that "knowledge"-based information (e.g. the original spelling of a name) is allowed when the data comes from Wikipedia, but not from any other web crawl, to avoid unsolicited noise. **CulturaX + Wikipedia**: - removed documents where the ratio of punctuation marks vs. non-whitespace characters is higher than 0.2 - removed documents where the ratio of uppercase vs. non-whitespace characters is higher than 0.22 - removed documents where the ratio of digits vs. non-whitespace characters is higher than 0.16 - removed documents where the average token length is < 2 or > 20 ## Bad words ```python BAD_PHRASES_DOC_LEVEL = { # https://en.wikipedia.org/wiki/Dutch_profanity "achterlijk", "debiel", "downie", "idioot", "kankerlijer", "klere", "kolere", "minkukel", "pestkop", "pleuris", "pleuritis", "teringlijer", "tyfuslijer", "gadver", "getver", "godver", "godskolere", "godverork", "graftak", "kopvod", "verdomme", "anaalgeneraal", "bitch", "dikzak", "flikker", "fok", "fuck", "hoer", "klootzak", "klote", "kreng", "kringspiermusketier", "kut", "lamzak", "lul", "manwijf", "matennaai", "neuken", "neuker", "ouwehoer", "reet", "reetkever", "reetridder", "rotzak", "schijt", "shit", "slet", "slijmbal", "slons", "sodemieter", "stoephoer", "swaffel", "teef", "trut", "tut", "zak", "uilskuiken", "zeik", "bamivreter", "bosneger", "neger", "fransoos", "geitenneuker", "kaaskop", "kakker", "koelie", "lijp", "medelander", "mocro", "mof", "nikker", "poepchinees", "roetmop", "spaghettivreter", "loempiavouwer", "spanjool", "spleetoog", "tatta", "tokkie", "zandneger", "zwartzak", "halvezool", "kenau", "klootviool", "knuppel", "koekert", "koekwaus",
"oelewapper", "smeerlap", "sukkel", "sul", "wappie", "wijf", "zooi", # xxx (a.o. https://gitlab.com/yhavinga/c4nlpreproc/-/blob/master/clean/badwords_ennl.py?ref_type=heads) "xxx", "anal", "blowjob", "buttplug", "cock", "cunt", "geil", "sex", # Standaardnederlands = seks, maybe we catch some porn or social media sites with this misspelling "porn", # extra "nigger", "nigga", "hoerig", "klojo", } ``` ## License information For CulturaX: https://huggingface.co/datasets/uonlp/CulturaX#license-information For Wikipedia: https://huggingface.co/datasets/wikimedia/wikipedia#licensing-information
0x22almostEvil/tatoeba-mt-llama-only
--- license: cc-by-2.0 task_categories: - translation language: - en - ru - de - uk - sv - sr - sl - ro - pt - pl - nl - it - hu - hr - fr - es - da - cs - ca - bg tags: - tatoeba - Translation pretty_name: tatoeba-mt-llama-only size_categories: - 1M<n<10M --- # Dataset Card for multilingual Tatoeba translations with ~3M entries (LLaMA-supported languages only) ### Dataset Summary ~3M entries. A more user-friendly version that combines all entries of the original dataset (LLaMA-supported languages only) into a single file: https://huggingface.co/datasets/Helsinki-NLP/tatoeba_mt
drewski/results
--- license: mit ---
AlienKevin/klee
--- license: cc0-1.0 ---
open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B
--- pretty_name: Evaluation run of PulsarAI/Chat-AYB-Platypus2-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PulsarAI/Chat-AYB-Platypus2-13B](https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T16:53:41.047162](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B/blob/main/results_2023-10-28T16-53-41.047162.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2752726510067114,\n\ \ \"em_stderr\": 0.0045741300617909856,\n \"f1\": 0.38116505872483314,\n\ \ \"f1_stderr\": 0.004403649120675284,\n \"acc\": 0.3936315988829403,\n\ \ \"acc_stderr\": 0.0083541228301978\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2752726510067114,\n \"em_stderr\": 0.0045741300617909856,\n\ \ \"f1\": 0.38116505872483314,\n \"f1_stderr\": 0.004403649120675284\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \ \ \"acc_stderr\": 0.004665893134220814\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174785\n\ \ }\n}\n```" repo_url: https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|arc:challenge|25_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T14-46-05.202813.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T16_53_41.047162 path: - '**/details_harness|drop|3_2023-10-28T16-53-41.047162.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T16-53-41.047162.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T16_53_41.047162 path: - '**/details_harness|gsm8k|5_2023-10-28T16-53-41.047162.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T16-53-41.047162.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hellaswag|10_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T14_46_05.202813 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-46-05.202813.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T16_53_41.047162 path: - '**/details_harness|winogrande|5_2023-10-28T16-53-41.047162.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T16-53-41.047162.parquet' - config_name: results data_files: - split: 2023_10_08T14_46_05.202813 path: - results_2023-10-08T14-46-05.202813.parquet - split: 2023_10_28T16_53_41.047162 path: - results_2023-10-28T16-53-41.047162.parquet - split: latest path: - results_2023-10-28T16-53-41.047162.parquet --- # Dataset Card for Evaluation run of PulsarAI/Chat-AYB-Platypus2-13B ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PulsarAI/Chat-AYB-Platypus2-13B](https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T16:53:41.047162](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B/blob/main/results_2023-10-28T16-53-41.047162.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2752726510067114, "em_stderr": 0.0045741300617909856, "f1": 0.38116505872483314, "f1_stderr": 0.004403649120675284, "acc": 0.3936315988829403, "acc_stderr": 0.0083541228301978 }, "harness|drop|3": { "em": 0.2752726510067114, "em_stderr": 0.0045741300617909856, "f1": 0.38116505872483314, "f1_stderr": 0.004403649120675284 }, "harness|gsm8k|5": { "acc": 0.029567854435178165, "acc_stderr": 0.004665893134220814 }, "harness|winogrande|5": { "acc": 0.7576953433307024, "acc_stderr": 0.012042352526174785 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
LukasSonn/DoxygenStrings-Short
---
license: apache-2.0
---

# Dataset Info

C++ + Natural Description -> Doxygen Documentation

This dataset was created for my bachelor's thesis investigating how LLMs can be fine-tuned to generate Doxygen documentation. It was built from the "Source code analysis dataset" by Gelman, Banjo Obayomi, Jessica Moore, and David Slater (doi: 10.1016/j.dib.2019.104712). The following SQL statement was used to select raw data from that dataset:

```
SELECT *
FROM all_data
WHERE LENGTH(comment) < 350
  AND LENGTH(comment) > 10
  AND LENGTH(code) > 100
  AND LENGTH(code) < 800
  AND code NOT LIKE '%//%'
  AND code NOT LIKE '%/*%'
  AND code NOT LIKE '%*/%'
  AND filename LIKE '%.cpp%'
LIMIT 12000
```

After selecting the data, Code Llama Instruct 34B was tasked with combining the human-written description of the functionality with the function code into a Doxygen comment. Any results that included the sample Doxygen string, or no Doxygen string at all, were filtered from the set.
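The filtering step described above can be sketched as a simple predicate over the model output; `looks_like_doxygen` and the `sample_marker` default are hypothetical illustrations, not the actual code used for the thesis:

```python
def looks_like_doxygen(generated: str, sample_marker: str = "@brief Sample") -> bool:
    """Heuristic filter: keep only results that contain some Doxygen
    string and are not a copy of the few-shot sample comment."""
    has_doxygen = "/**" in generated or "///" in generated or "@brief" in generated
    is_sample_copy = sample_marker in generated
    return has_doxygen and not is_sample_copy

# A generated answer with a real Doxygen comment is kept ...
assert looks_like_doxygen("/** Adds two integers. */\nint add(int a, int b);")
# ... while one with no Doxygen string at all is dropped.
assert not looks_like_doxygen("int add(int a, int b);")
```

In practice such a filter would run over each generated sample before it enters the final set, mirroring the two rejection cases named above.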
malhajar/OpenOrca-tr
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: system_prompt
    dtype: string
  - name: question
    dtype: string
  - name: response
    dtype: string
  - name: system_prompt-turkish
    dtype: string
  - name: question-turkish
    dtype: string
  - name: response-turkish
    dtype: string
  splits:
  - name: train
    num_bytes: 8500889145
    num_examples: 2352811
  download_size: 4792916697
  dataset_size: 8500889145
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
size_categories:
- 1M<n<10M
license: mit
task_categories:
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
language:
- tr
---

# Dataset Card for "OpenOrca-tr"

This dataset is part of a series of datasets aimed at advancing Turkish LLM development by establishing a rigorous Turkish dataset collection to enhance the performance of LLMs produced in the Turkish language. malhajar/OpenOrca-tr is a translated version of [`OpenOrca`](https://huggingface.co/datasets/Open-Orca/OpenOrca) and is the first SFT dataset in the Turkish language with more than 2M entries!

**Translated by:** [`Mohamad Alhajar`](https://www.linkedin.com/in/muhammet-alhajar/)

# Dataset Summary

The OpenOrca dataset is a collection of augmented [FLAN Collection data](https://arxiv.org/abs/2301.13688): currently ~1M GPT-4 completions and ~3.2M GPT-3.5 completions. It is tabularized in alignment with the distributions presented in the Orca paper and currently represents a partial completion of the full intended dataset, with ongoing generation to expand its scope. The data is primarily used for training and evaluation in the field of natural language processing.

### Usage

To load the dataset, run:

```python
from datasets import load_dataset

ds = load_dataset("malhajar/OpenOrca-tr")
```

<a name="languages"></a>
# Languages

The language of the data is primarily Turkish.
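Each row carries both the original English fields and their Turkish translations. A minimal sketch of turning one row into a single SFT training string, assuming the hyphenated column names from the schema above (`format_example` and the row below are made-up illustrations, not part of the dataset itself):

```python
def format_example(row: dict) -> str:
    """Join the Turkish system prompt, question, and response into one
    SFT-style training string, skipping empty fields."""
    parts = [row["system_prompt-turkish"], row["question-turkish"], row["response-turkish"]]
    return "\n\n".join(p for p in parts if p)

row = {  # made-up example, not an actual dataset entry
    "system_prompt-turkish": "Sen yardımsever bir asistansın.",
    "question-turkish": "Türkiye'nin başkenti neresidir?",
    "response-turkish": "Türkiye'nin başkenti Ankara'dır.",
}
print(format_example(row))
```

The same pattern works row-by-row over the full `train` split, e.g. inside a `Dataset.map` call.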
<a name="dataset-structure"></a>

# Citation

```bibtex
@misc{OpenOrca,
  title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
  author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
  year = {2023},
  publisher = {HuggingFace},
  journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
```
sanchit-gandhi/librispeech_asr_dummy_noise-noise
--- dataset_info: - config_name: validation-pub-noise features: - name: audio dtype: audio - name: text dtype: string - name: id dtype: string splits: - name: '40' num_bytes: 3708657.0 num_examples: 6 - name: '35' num_bytes: 3708657.0 num_examples: 6 - name: '30' num_bytes: 3708657.0 num_examples: 6 - name: '25' num_bytes: 3708657.0 num_examples: 6 - name: '20' num_bytes: 3708657.0 num_examples: 6 - name: '15' num_bytes: 3708657.0 num_examples: 6 - name: '10' num_bytes: 3708657.0 num_examples: 6 - name: '5' num_bytes: 3708657.0 num_examples: 6 - name: '0' num_bytes: 3708657.0 num_examples: 6 - name: minus5 num_bytes: 3708657.0 num_examples: 6 - name: minus10 num_bytes: 3708657.0 num_examples: 6 download_size: 23320628 dataset_size: 40795227.0 - config_name: validation-white-noise features: - name: audio dtype: audio - name: text dtype: string - name: id dtype: string splits: - name: '40' num_bytes: 3708657.0 num_examples: 6 - name: '35' num_bytes: 3708657.0 num_examples: 6 - name: '30' num_bytes: 3708657.0 num_examples: 6 - name: '25' num_bytes: 3708657.0 num_examples: 6 - name: '20' num_bytes: 3708657.0 num_examples: 6 - name: '15' num_bytes: 3708657.0 num_examples: 6 - name: '10' num_bytes: 3708657.0 num_examples: 6 - name: '5' num_bytes: 3708657.0 num_examples: 6 - name: '0' num_bytes: 3708657.0 num_examples: 6 - name: minus5 num_bytes: 3708657.0 num_examples: 6 - name: minus10 num_bytes: 3708657.0 num_examples: 6 download_size: 23568938 dataset_size: 40795227.0 configs: - config_name: validation-pub-noise data_files: - split: '40' path: validation-pub-noise/40-* - split: '35' path: validation-pub-noise/35-* - split: '30' path: validation-pub-noise/30-* - split: '25' path: validation-pub-noise/25-* - split: '20' path: validation-pub-noise/20-* - split: '15' path: validation-pub-noise/15-* - split: '10' path: validation-pub-noise/10-* - split: '5' path: validation-pub-noise/5-* - split: '0' path: validation-pub-noise/0-* - split: minus5 path: 
validation-pub-noise/minus5-* - split: minus10 path: validation-pub-noise/minus10-* - config_name: validation-white-noise data_files: - split: '40' path: validation-white-noise/40-* - split: '35' path: validation-white-noise/35-* - split: '30' path: validation-white-noise/30-* - split: '25' path: validation-white-noise/25-* - split: '20' path: validation-white-noise/20-* - split: '15' path: validation-white-noise/15-* - split: '10' path: validation-white-noise/10-* - split: '5' path: validation-white-noise/5-* - split: '0' path: validation-white-noise/0-* - split: minus5 path: validation-white-noise/minus5-* - split: minus10 path: validation-white-noise/minus10-* --- # Dataset Card for "librispeech_asr_dummy_noise-noise" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
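The split names above encode the signal-to-noise ratio in dB, with negative SNRs spelled out as `minus5` and `minus10`. A small helper for mapping split names back to numeric SNR values (a sketch, assuming that naming scheme holds for every split):

```python
def split_to_snr(split_name: str) -> int:
    """Convert a split name like '40' or 'minus5' into its SNR in dB."""
    if split_name.startswith("minus"):
        return -int(split_name[len("minus"):])
    return int(split_name)

snrs = [split_to_snr(s) for s in ["40", "5", "0", "minus5", "minus10"]]
# -> [40, 5, 0, -5, -10]
```

This makes it easy to iterate over all eleven noise conditions in numeric order when plotting WER against SNR.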
CVasNLPExperiments/OxfordPets_test_eachadea_vicuna_7b_1.1_mode_T_SPECIFIC_A_ns_3669
--- dataset_info: features: - name: id dtype: int64 - name: prompt dtype: string - name: true_label dtype: string - name: prediction dtype: string splits: - name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices num_bytes: 1601420 num_examples: 3669 download_size: 196649 dataset_size: 1601420 --- # Dataset Card for "OxfordPets_test_eachadea_vicuna_7b_1.1_mode_T_SPECIFIC_A_ns_3669" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
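With `true_label` and `prediction` stored as plain strings, scoring this prediction dump reduces to a row-wise comparison. A minimal sketch with a hypothetical normalizer and made-up rows (the real model outputs may need task-specific parsing before comparison):

```python
def accuracy(rows: list) -> float:
    """Fraction of rows whose normalized prediction matches the true label."""
    norm = lambda s: s.strip().lower()
    hits = sum(norm(r["prediction"]) == norm(r["true_label"]) for r in rows)
    return hits / len(rows)

rows = [  # made-up rows, not actual dataset entries
    {"true_label": "beagle", "prediction": " Beagle"},
    {"true_label": "persian", "prediction": "siamese"},
]
print(accuracy(rows))  # -> 0.5
```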
fursov/gec_ner
--- dataset_info: features: - name: tokens sequence: string - name: ner_tags sequence: int64 splits: - name: train num_bytes: 21496736.708623063 num_examples: 55538 - name: test num_bytes: 1548254.2913769358 num_examples: 4000 download_size: 4069528 dataset_size: 23044991.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
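The card ships only the schema: each example pairs a `tokens` sequence with an equal-length `ner_tags` sequence of integer ids. A sketch of decoding the ids back to labels (the `id2label` mapping below is hypothetical, since the card does not document the actual tag inventory):

```python
def decode_tags(tokens: list, tag_ids: list, id2label: dict) -> list:
    """Pair each token with its decoded tag label."""
    assert len(tokens) == len(tag_ids), "sequences must be aligned"
    return [(tok, id2label[t]) for tok, t in zip(tokens, tag_ids)]

id2label = {0: "KEEP", 1: "EDIT"}  # hypothetical labels, not from the card
pairs = decode_tags(["She", "go", "home"], [0, 1, 0], id2label)
# -> [('She', 'KEEP'), ('go', 'EDIT'), ('home', 'KEEP')]
```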
open-llm-leaderboard/details_venkycs__zyte-v1-1.1B
--- pretty_name: Evaluation run of venkycs/zyte-v1-1.1B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [venkycs/zyte-v1-1.1B](https://huggingface.co/venkycs/zyte-v1-1.1B) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_venkycs__zyte-v1-1.1B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-10T21:27:28.725730](https://huggingface.co/datasets/open-llm-leaderboard/details_venkycs__zyte-v1-1.1B/blob/main/results_2024-01-10T21-27-28.725730.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25348202199685704,\n\ \ \"acc_stderr\": 0.030566154341037797,\n \"acc_norm\": 0.25435724416392974,\n\ \ \"acc_norm_stderr\": 0.031318310521318005,\n \"mc1\": 0.2839657282741738,\n\ \ \"mc1_stderr\": 0.015785370858396736,\n \"mc2\": 0.42589514098170206,\n\ \ \"mc2_stderr\": 0.014717544653312008\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.34982935153583616,\n \"acc_stderr\": 0.013936809212158277,\n\ \ \"acc_norm\": 0.3728668941979522,\n \"acc_norm_stderr\": 0.014131176760131163\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4584744074885481,\n\ \ \"acc_stderr\": 0.0049725431277678755,\n \"acc_norm\": 0.6141206930890261,\n\ \ \"acc_norm_stderr\": 0.0048580740134439885\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\ \ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\ \ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343602,\n\ \ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343602\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\ \ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \ \ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\ \ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\ \ \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n\ \ \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\ \ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\ \ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.16666666666666666,\n\ \ \"acc_stderr\": 0.03505859682597264,\n \"acc_norm\": 0.16666666666666666,\n\ \ \"acc_norm_stderr\": 0.03505859682597264\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\ \ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.23015873015873015,\n \"acc_stderr\": 0.02167921966369314,\n \"\ acc_norm\": 0.23015873015873015,\n 
\"acc_norm_stderr\": 0.02167921966369314\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\ \ \"acc_stderr\": 0.03455071019102148,\n \"acc_norm\": 0.18253968253968253,\n\ \ \"acc_norm_stderr\": 0.03455071019102148\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366255,\n \ \ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366255\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.2064516129032258,\n \"acc_stderr\": 0.02302589961718871,\n \"\ acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.02302589961718871\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489617,\n \"\ acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489617\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.23737373737373738,\n \"acc_stderr\": 0.0303137105381989,\n \"\ acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.0303137105381989\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178267,\n\ \ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178267\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560483,\n\ \ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560483\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\ \ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"\ acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.23853211009174313,\n \"acc_stderr\": 0.01827257581023187,\n \"\ acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.01827257581023187\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"\ acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814565,\n \"\ acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814565\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \ \ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\ \ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.336322869955157,\n\ \ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\ \ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\ acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\ \ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.0398913985953177,\n\ \ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.0398913985953177\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\ \ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\ \ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\ \ \"acc_stderr\": 0.016246087069701393,\n \"acc_norm\": 0.29118773946360155,\n\ \ \"acc_norm_stderr\": 0.016246087069701393\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\ \ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\ \ \"acc_stderr\": 0.014355911964767864,\n 
\"acc_norm\": 0.2435754189944134,\n\ \ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.02440439492808787,\n\ \ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.02440439492808787\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\ \ \"acc_stderr\": 0.02492672322484554,\n \"acc_norm\": 0.2604501607717042,\n\ \ \"acc_norm_stderr\": 0.02492672322484554\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\ \ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266733,\n \ \ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266733\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23728813559322035,\n\ \ \"acc_stderr\": 0.010865436690780281,\n \"acc_norm\": 0.23728813559322035,\n\ \ \"acc_norm_stderr\": 0.010865436690780281\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n\ \ \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \ \ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\ \ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\ \ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.16326530612244897,\n \"acc_stderr\": 0.02366169917709862,\n\ \ \"acc_norm\": 0.16326530612244897,\n \"acc_norm_stderr\": 
0.02366169917709862\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\ \ \"acc_stderr\": 0.031157150869355568,\n \"acc_norm\": 0.263681592039801,\n\ \ \"acc_norm_stderr\": 0.031157150869355568\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\ \ \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.3132530120481928,\n\ \ \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.031885780176863984,\n\ \ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.031885780176863984\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\ \ \"mc1_stderr\": 0.015785370858396736,\n \"mc2\": 0.42589514098170206,\n\ \ \"mc2_stderr\": 0.014717544653312008\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6203630623520127,\n \"acc_stderr\": 0.013639245403711153\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \ \ \"acc_stderr\": 0.0031957470754808027\n }\n}\n```" repo_url: https://huggingface.co/venkycs/zyte-v1-1.1B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|arc:challenge|25_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|arc:challenge|25_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-10T21-27-28.725730.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|gsm8k|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: 
- '**/details_harness|gsm8k|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hellaswag|10_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hellaswag|10_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T21-22-02.953307.parquet' - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-10T21-22-02.953307.parquet' - 
'**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-10T21-22-02.953307.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T21-27-28.725730.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T21-27-28.725730.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T21-27-28.725730.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T21-27-28.725730.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T21-27-28.725730.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-10T21-27-28.725730.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T21-27-28.725730.parquet' 
- config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - 
'**/details_harness|hendrycksTest-computer_security|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_10T21_22_02.953307 
path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T21-22-02.953307.parquet' 
- split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - 
'**/details_harness|hendrycksTest-human_aging|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T21-27-28.725730.parquet' - config_name: 
harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-management|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-management|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - 
'**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T21-27-28.725730.parquet' - config_name: 
harness_hendrycksTest_virology_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T21-27-28.725730.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|truthfulqa:mc|0_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|truthfulqa:mc|0_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-10T21-27-28.725730.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_10T21_22_02.953307 path: - '**/details_harness|winogrande|5_2024-01-10T21-22-02.953307.parquet' - split: 2024_01_10T21_27_28.725730 path: - '**/details_harness|winogrande|5_2024-01-10T21-27-28.725730.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-10T21-27-28.725730.parquet' - config_name: results data_files: - split: 2024_01_10T21_22_02.953307 path: - results_2024-01-10T21-22-02.953307.parquet - split: 2024_01_10T21_27_28.725730 path: - results_2024-01-10T21-27-28.725730.parquet - split: latest path: - results_2024-01-10T21-27-28.725730.parquet --- # Dataset Card for Evaluation run of venkycs/zyte-v1-1.1B <!-- Provide a quick summary of the dataset. 
-->

Dataset automatically created during the evaluation run of model [venkycs/zyte-v1-1.1B](https://huggingface.co/venkycs/zyte-v1-1.1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_venkycs__zyte-v1-1.1B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-10T21:27:28.725730](https://huggingface.co/datasets/open-llm-leaderboard/details_venkycs__zyte-v1-1.1B/blob/main/results_2024-01-10T21-27-28.725730.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.25348202199685704, "acc_stderr": 0.030566154341037797, "acc_norm": 0.25435724416392974, "acc_norm_stderr": 0.031318310521318005, "mc1": 0.2839657282741738, "mc1_stderr": 0.015785370858396736, "mc2": 0.42589514098170206, "mc2_stderr": 0.014717544653312008 }, "harness|arc:challenge|25": { "acc": 0.34982935153583616, "acc_stderr": 0.013936809212158277, "acc_norm": 0.3728668941979522, "acc_norm_stderr": 0.014131176760131163 }, "harness|hellaswag|10": { "acc": 0.4584744074885481, "acc_stderr": 0.0049725431277678755, "acc_norm": 0.6141206930890261, "acc_norm_stderr": 0.0048580740134439885 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909284, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.23703703703703705, "acc_stderr": 0.03673731683969506, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.03673731683969506 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2236842105263158, "acc_stderr": 0.03391160934343602, "acc_norm": 0.2236842105263158, "acc_norm_stderr": 0.03391160934343602 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2339622641509434, "acc_stderr": 0.02605529690115292, "acc_norm": 0.2339622641509434, "acc_norm_stderr": 0.02605529690115292 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.25, "acc_stderr": 0.03621034121889507, "acc_norm": 0.25, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.24, "acc_stderr": 0.04292346959909284, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.045126085985421296, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421296 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2023121387283237, "acc_stderr": 0.030631145539198823, "acc_norm": 0.2023121387283237, "acc_norm_stderr": 0.030631145539198823 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.18627450980392157, "acc_stderr": 0.03873958714149351, "acc_norm": 0.18627450980392157, "acc_norm_stderr": 0.03873958714149351 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.26, "acc_stderr": 0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.16666666666666666, "acc_stderr": 0.03505859682597264, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.03505859682597264 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2620689655172414, "acc_stderr": 0.036646663372252565, "acc_norm": 0.2620689655172414, "acc_norm_stderr": 0.036646663372252565 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.23015873015873015, "acc_stderr": 0.02167921966369314, "acc_norm": 0.23015873015873015, "acc_norm_stderr": 0.02167921966369314 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.18253968253968253, "acc_stderr": 0.03455071019102148, "acc_norm": 0.18253968253968253, "acc_norm_stderr": 0.03455071019102148 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.19, "acc_stderr": 0.039427724440366255, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2064516129032258, "acc_stderr": 0.02302589961718871, "acc_norm": 0.2064516129032258, "acc_norm_stderr": 0.02302589961718871 }, "harness|hendrycksTest-high_school_chemistry|5": 
{ "acc": 0.22167487684729065, "acc_stderr": 0.029225575892489617, "acc_norm": 0.22167487684729065, "acc_norm_stderr": 0.029225575892489617 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24242424242424243, "acc_stderr": 0.03346409881055953, "acc_norm": 0.24242424242424243, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.23737373737373738, "acc_stderr": 0.0303137105381989, "acc_norm": 0.23737373737373738, "acc_norm_stderr": 0.0303137105381989 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.22797927461139897, "acc_stderr": 0.030276909945178267, "acc_norm": 0.22797927461139897, "acc_norm_stderr": 0.030276909945178267 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23333333333333334, "acc_stderr": 0.021444547301560483, "acc_norm": 0.23333333333333334, "acc_norm_stderr": 0.021444547301560483 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.026962424325073838, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073838 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2052980132450331, "acc_stderr": 0.03297986648473834, "acc_norm": 0.2052980132450331, "acc_norm_stderr": 0.03297986648473834 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23853211009174313, "acc_stderr": 0.01827257581023187, "acc_norm": 0.23853211009174313, "acc_norm_stderr": 0.01827257581023187 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35648148148148145, "acc_stderr": 0.032664783315272714, "acc_norm": 
0.35648148148148145, "acc_norm_stderr": 0.032664783315272714 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.23529411764705882, "acc_stderr": 0.02977177522814565, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.02977177522814565 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2911392405063291, "acc_stderr": 0.02957160106575337, "acc_norm": 0.2911392405063291, "acc_norm_stderr": 0.02957160106575337 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.336322869955157, "acc_stderr": 0.031708824268455005, "acc_norm": 0.336322869955157, "acc_norm_stderr": 0.031708824268455005 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.25190839694656486, "acc_stderr": 0.03807387116306086, "acc_norm": 0.25190839694656486, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.24793388429752067, "acc_stderr": 0.039418975265163025, "acc_norm": 0.24793388429752067, "acc_norm_stderr": 0.039418975265163025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2037037037037037, "acc_stderr": 0.03893542518824847, "acc_norm": 0.2037037037037037, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25766871165644173, "acc_stderr": 0.03436150827846917, "acc_norm": 0.25766871165644173, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340455, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340455 }, "harness|hendrycksTest-management|5": { "acc": 0.20388349514563106, "acc_stderr": 0.0398913985953177, "acc_norm": 0.20388349514563106, "acc_norm_stderr": 0.0398913985953177 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2606837606837607, "acc_stderr": 0.028760348956523414, "acc_norm": 0.2606837606837607, "acc_norm_stderr": 0.028760348956523414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 
0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.29118773946360155, "acc_stderr": 0.016246087069701393, "acc_norm": 0.29118773946360155, "acc_norm_stderr": 0.016246087069701393 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2435754189944134, "acc_stderr": 0.014355911964767864, "acc_norm": 0.2435754189944134, "acc_norm_stderr": 0.014355911964767864 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.238562091503268, "acc_stderr": 0.02440439492808787, "acc_norm": 0.238562091503268, "acc_norm_stderr": 0.02440439492808787 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2604501607717042, "acc_stderr": 0.02492672322484554, "acc_norm": 0.2604501607717042, "acc_norm_stderr": 0.02492672322484554 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.26851851851851855, "acc_stderr": 0.024659685185967277, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.024659685185967277 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24468085106382978, "acc_stderr": 0.025645553622266733, "acc_norm": 0.24468085106382978, "acc_norm_stderr": 0.025645553622266733 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23728813559322035, "acc_stderr": 0.010865436690780281, "acc_norm": 0.23728813559322035, "acc_norm_stderr": 0.010865436690780281 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.22794117647058823, "acc_stderr": 0.025483081468029804, "acc_norm": 0.22794117647058823, "acc_norm_stderr": 0.025483081468029804 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25980392156862747, "acc_stderr": 0.017740899509177795, "acc_norm": 0.25980392156862747, "acc_norm_stderr": 0.017740899509177795 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2909090909090909, "acc_stderr": 0.04350271442923243, 
"acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.16326530612244897, "acc_stderr": 0.02366169917709862, "acc_norm": 0.16326530612244897, "acc_norm_stderr": 0.02366169917709862 }, "harness|hendrycksTest-sociology|5": { "acc": 0.263681592039801, "acc_stderr": 0.031157150869355568, "acc_norm": 0.263681592039801, "acc_norm_stderr": 0.031157150869355568 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.3132530120481928, "acc_stderr": 0.03610805018031024, "acc_norm": 0.3132530120481928, "acc_norm_stderr": 0.03610805018031024 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2222222222222222, "acc_stderr": 0.031885780176863984, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.031885780176863984 }, "harness|truthfulqa:mc|0": { "mc1": 0.2839657282741738, "mc1_stderr": 0.015785370858396736, "mc2": 0.42589514098170206, "mc2_stderr": 0.014717544653312008 }, "harness|winogrande|5": { "acc": 0.6203630623520127, "acc_stderr": 0.013639245403711153 }, "harness|gsm8k|5": { "acc": 0.013646702047005308, "acc_stderr": 0.0031957470754808027 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
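The split-naming convention described in this card (one split per run timestamp, plus a `latest` alias) can also be resolved by hand when the alias is unavailable: the timestamps are fixed-width, so the newest run sorts last lexicographically. A minimal stdlib sketch, using the two run timestamps listed in this card:

```python
# Sketch: pick the newest timestamped split name.
# The fixed-width timestamp format (YYYY_MM_DDTHH_MM_SS.ffffff)
# sorts chronologically under plain string comparison.
splits = [
    "2024_01_10T21_22_02.953307",
    "2024_01_10T21_27_28.725730",
]
newest = max(splits)
print(newest)  # → 2024_01_10T21_27_28.725730
```

This reproduces what the `latest` split alias points to for each configuration above.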
rafaaa2105/stella
--- license: apache-2.0 task_categories: - text-generation language: - pt pretty_name: Stella ---
unitxt/data
--- license: apache-2.0 ---
liuyanchen1015/MULTI_VALUE_mnli_mass_noun_plurals
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 1177768 num_examples: 5019 - name: dev_mismatched num_bytes: 1288142 num_examples: 5274 - name: test_matched num_bytes: 1179658 num_examples: 5013 - name: test_mismatched num_bytes: 1329643 num_examples: 5482 - name: train num_bytes: 47564536 num_examples: 200815 download_size: 34086483 dataset_size: 52539747 --- # Dataset Card for "MULTI_VALUE_mnli_mass_noun_plurals" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
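The `dataset_info` block above declares both per-split sizes and a total `dataset_size`; as a quick consistency check (a sketch using only the numbers from this card), the per-split `num_bytes` values should sum to the declared total:

```python
# Sanity check: per-split num_bytes from the card's dataset_info
# should add up to the declared dataset_size.
split_bytes = {
    "dev_matched": 1177768,
    "dev_mismatched": 1288142,
    "test_matched": 1179658,
    "test_mismatched": 1329643,
    "train": 47564536,
}
dataset_size = 52539747
print(sum(split_bytes.values()) == dataset_size)  # → True
```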
DIANO777/GANGPLANKDATASET
--- license: openrail ---
tiagoblima/nilc-masked-punctuation
--- dataset_info: features: - name: text dtype: string - name: label dtype: string - name: reference dtype: string splits: - name: train num_bytes: 376331 num_examples: 1236 download_size: 228368 dataset_size: 376331 --- # Dataset Card for "nilc-masked-punctuation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
UnderstandLing/oasst1_pt_threads
--- license: apache-2.0 dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 8572263 num_examples: 9620 - name: validation num_bytes: 451708 num_examples: 503 download_size: 4528786 dataset_size: 9023971 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* ---
pravsels/ManimML_helblazer811_issues
--- dataset_info: features: - name: number dtype: int64 - name: content dtype: string - name: comments sequence: string splits: - name: train num_bytes: 47832 num_examples: 38 download_size: 19770 dataset_size: 47832 configs: - config_name: default data_files: - split: train path: data/train-* ---
yentinglin/Taiwan-Bench
---
license: apache-2.0
task_categories:
- table-question-answering
- question-answering
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
pretty_name: TWMTBench
data_files:
- split: test
  path: Taiwan-MT-Bench.jsonl
---

<img src="https://cdn-uploads.huggingface.co/production/uploads/5df9c78eda6d0311fd3d541f/CmusIT5OlSXvFrbTJ7l-C.png" alt="Taiwan LLM Logo" width="800" style="margin-left: auto; margin-right: auto; display: block;"/>

## Performance

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5df9c78eda6d0311fd3d541f/HTwIzw6RDha2-PhuWqSuI.png)

## Citation

If you find Taiwan LLM useful in your work, please cite it with:

```
@misc{zheng2023judging,
      title={Judging LLM-as-a-judge with MT-Bench and Chatbot Arena},
      author={Lianmin Zheng and Wei-Lin Chiang and Ying Sheng and Siyuan Zhuang and Zhanghao Wu and Yonghao Zhuang and Zi Lin and Zhuohan Li and Dacheng Li and Eric. P Xing and Hao Zhang and Joseph E. Gonzalez and Ion Stoica},
      year={2023},
      eprint={2306.05685},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

@misc{lin2023taiwan,
      title={Taiwan LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model},
      author={Yen-Ting Lin and Yun-Nung Chen},
      year={2023},
      eprint={2311.17487},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
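The `data_files` entry above stores the test split as a JSON Lines file (`Taiwan-MT-Bench.jsonl`). A minimal stdlib sketch of reading that format; note the record fields (`question_id`, `category`) are assumptions for illustration, since the actual schema is not shown in this card:

```python
import io
import json

# Hypothetical two-record sample in JSON Lines format: one JSON object
# per line, parsed independently.
sample = io.StringIO(
    '{"question_id": 1, "category": "writing"}\n'
    '{"question_id": 2, "category": "reasoning"}\n'
)
records = [json.loads(line) for line in sample if line.strip()]
print(len(records))  # → 2
```

In practice the same file can also be loaded directly through the `datasets` library, which resolves the `data_files` mapping from this card.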
ramo6627/open-australian-legal-qa-gemma-formatted-2k
---
dataset_info:
  features:
  - name: qa
    dtype: string
  splits:
  - name: train
    num_bytes: 1714179
    num_examples: 2000
  download_size: 727504
  dataset_size: 1714179
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
metaeval/counterfactually-augmented-snli
---
license: unknown
task_categories:
- text-classification
language:
- en
---

```bib
@article{kaushik2020learning,
  title={Learning the Difference that Makes a Difference with Counterfactually Augmented Data},
  author={Kaushik, Divyansh and Hovy, Eduard and Lipton, Zachary C},
  journal={International Conference on Learning Representations (ICLR)},
  year={2020}
}
```
DucHaiten/anime-SDXL
---
license: creativeml-openrail-m
---
CyberHarem/mujina
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of Mujina/ムジナ/貉 (SSSS.DYNAZENON)

This is the dataset of Mujina/ムジナ/貉 (SSSS.DYNAZENON), containing 234 images and their tags.

The core tags of this character are `short_hair, blue_eyes, breasts, bangs, large_breasts`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                | Type       | Description                                                          |
|:-----------------|-------:|:-----------|:--------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw              | 234    | 308.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 234    | 162.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina/resolve/main/dataset-800.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  | 580    | 351.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             | 234    | 269.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina/resolve/main/dataset-1200.zip)             | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 580    | 506.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/mujina',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering result, maybe some outfits can be mined here.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cleavage, looking_at_viewer, navel, purple_bikini, solo, wet, parted_lips, sitting, white_background, armpits, brown_hair |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, cleavage, collarbone, purple_bikini, thighs, bare_shoulders, blush, looking_at_viewer, sitting, solo, crossed_legs, navel, pink_hair, parted_lips, wet |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, ass, looking_at_viewer, looking_back, solo, thighs, purple_bikini, simple_background, white_background, bare_shoulders, from_behind, brown_hair, from_below, parted_lips |
| 3 | 79 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, solo, white_gloves, white_jacket, purple_shorts, thighs, corset, looking_at_viewer, military_jacket, short_necktie, underbust, open_clothes, purple_necktie, white_shirt, short_shorts, white_background |
| 4 | 20 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, 1boy, blush, hetero, pussy, sweat, sex, nipples, vaginal, completely_nude, mosaic_censoring, open_mouth, penis, looking_at_viewer, solo_focus, spread_legs, thighs, navel, anus, ass, brown_hair, hair_between_eyes, lying |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | navel | purple_bikini | solo | wet | parted_lips | sitting | white_background | armpits | brown_hair | collarbone | thighs | bare_shoulders | blush | crossed_legs | pink_hair | ass | looking_back | simple_background | from_behind | from_below | white_gloves | white_jacket | purple_shorts | corset | military_jacket | short_necktie | underbust | open_clothes | purple_necktie | white_shirt | short_shorts | 1boy | hetero | pussy | sweat | sex | nipples | vaginal | completely_nude | mosaic_censoring | open_mouth | penis | solo_focus | spread_legs | anus | hair_between_eyes | lying |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | X | X | | X | | X | | X | | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 79 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | | X | | | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 20 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | | | | | | | | X | | X | | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
Cohere/miracl-fa-corpus-22-12
---
annotations_creators:
- expert-generated
language:
- fa
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---

# MIRACL (fa) embedded with cohere.ai `multilingual-22-12` encoder

We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.

The query embeddings can be found in [Cohere/miracl-fa-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-fa-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-fa-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-fa-corpus-22-12).

For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).

Dataset info:

> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.

## Embeddings

We compute the embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Loading the dataset

In [miracl-fa-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-fa-corpus-22-12) we provide the corpus embeddings. Note that, depending on the selected split, the respective files can be quite large.

You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-fa-corpus-22-12", split="train")
```

Or you can stream it, without downloading it first:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-fa-corpus-22-12", split="train", streaming=True)

for doc in docs:
    docid = doc['docid']
    title = doc['title']
    text = doc['text']
    emb = doc['emb']
```

## Search

Have a look at [miracl-fa-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-fa-queries-22-12), where we provide the query embeddings for the MIRACL dataset.

To search in the documents you must use **dot-product**: compare the query embeddings with the document embeddings, either in a vector database (recommended) or by computing the dot product directly.

A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB.

from datasets import load_dataset
import torch

# Load documents + embeddings
docs = load_dataset("Cohere/miracl-fa-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])

# Load queries
queries = load_dataset("Cohere/miracl-fa-queries-22-12", split="dev")

# Select the first query as example
qid = 0
query = queries[qid]
query_embedding = torch.tensor(query['emb']).unsqueeze(0)  # shape: (1, dim)

# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)

# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
    print(docs[doc_id]['title'])
    print(docs[doc_id]['text'])
```

You can get embeddings for new queries using our API:
```python
# Run: pip install cohere
import cohere
co = cohere.Client(f"{api_key}")  # You should add your cohere API Key here :))
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0]  # Get the embedding for the first text
```

## Performance

In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset.

We compute nDCG@10 (a ranking-based metric), as well as hit@3: is at least one relevant document among the top-3 results? We find that hit@3 is easier to interpret, as it reports the share of queries for which a relevant document is found among the top-3 results.

Note: MIRACL only annotated a small fraction of passages (10 per query) for relevancy. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes. Real nDCG@10 and hit@3 performance is likely higher than depicted.
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 acc@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |

Further languages (not supported by Elasticsearch):

| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
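The two metrics in the tables above can be stated precisely: hit@k asks whether any relevant document appears in the top-k results, while nDCG@k discounts each relevant hit by the log of its rank and normalizes by the ideal ranking. A minimal sketch of both under binary relevance (an illustration of the definitions, not the exact MIRACL evaluation code):

```python
import math

def hit_at_k(ranked_ids, relevant_ids, k=3):
    """1.0 if any relevant document appears in the top-k results, else 0.0."""
    return float(any(doc in relevant_ids for doc in ranked_ids[:k]))

def ndcg_at_k(ranked_ids, relevant_ids, k=10):
    """Binary-relevance nDCG@k: DCG of the ranking divided by the ideal DCG."""
    dcg = sum(
        1.0 / math.log2(rank + 2)  # rank is 0-based, so the discount is log2(rank + 2)
        for rank, doc in enumerate(ranked_ids[:k])
        if doc in relevant_ids
    )
    # Ideal DCG: all relevant documents ranked first.
    ideal = sum(1.0 / math.log2(r + 2) for r in range(min(len(relevant_ids), k)))
    return dcg / ideal if ideal > 0 else 0.0

ranking = ["d3", "d7", "d1", "d9"]   # system output, best first
relevant = {"d1", "d9"}              # annotated relevant docs
print(hit_at_k(ranking, relevant))   # 1.0 (d1 appears within the top 3)
print(ndcg_at_k(ranking, relevant))
```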
volvoDon/necronomicon
---
license: apache-2.0
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 178080
    num_examples: 1
  download_size: 0
  dataset_size: 178080
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
ganeshkamath89/my-awesome-dataset
---
license: mit
---
totally-not-an-llm/EverythingLM-data-V2
---
license: mit
---

# EverythingLM V2 Dataset

**EverythingLM V2** is a diverse instruct dataset consisting of 1k human-assistant conversations. These sets were generated using principles from both evol-instruct and Orca. The dataset encompasses a wide array of topics and interactions.

### Differences from V1:

- All data in V2 is generated by GPT-4
- Higher-quality dataset generation pipeline:
  - More humanlike seed prompts
  - Fixed some bugs in the script
  - More diverse creative writing
  - More diverse seed prompts in general
  - Attempt not to overfit the model on complex instructions by occasionally skipping evol

### Cost:

Reproducing this dataset would cost roughly $40.

### Instruction Categories:

- Reasoning
- Creative Writing
- General Knowledge
- Brainstorming
- Search Query
- Coding
- Basic Instruct

We also leverage various system prompts for evol-instruct and for responding to prompts. This dataset has also been filtered to remove OpenAI alignment.

### How it stands out:

- Long, detailed outputs
- Humanlike creativity
- CoT reasoning
- Complex & challenging tasks

### Plans:

- Train Llama 7b & 13b models (13b model V1 trained)
- Train Llama 70b QLoRA
- Generate V2 of the dataset, with more categories and GPT-4 (DONE) ✓

Included in this repo is the script to generate the dataset.
muibk/emea_en-de_20k
---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: translation
    struct:
    - name: de
      dtype: string
    - name: en
      dtype: string
  splits:
  - name: train
    num_bytes: 2746077.5302918116
    num_examples: 18000
  - name: test
    num_bytes: 152559.8627939895
    num_examples: 1000
  - name: valid
    num_bytes: 152559.8627939895
    num_examples: 1000
  download_size: 1901357
  dataset_size: 3051197.2558797905
---

# Dataset Card for "emea_en-de_20k"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
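Each row of this dataset follows the `translation` struct declared above, nesting the German and English sides in one field. A minimal sketch of records in that shape (the sentence pairs below are invented for illustration only):

```python
# Records shaped like the card's schema: an integer `id` plus a
# `translation` struct keyed by language code.
records = [
    {"id": i, "translation": {"de": de, "en": en}}
    for i, (de, en) in enumerate([
        ("Nur zur äußerlichen Anwendung.", "For external use only."),
        ("Vor Gebrauch gut schütteln.", "Shake well before use."),
    ])
]

# Pull out aligned source/target lists, as a translation pipeline would.
en_side = [r["translation"]["en"] for r in records]
de_side = [r["translation"]["de"] for r in records]
print(list(zip(en_side, de_side))[0])
```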
lmms-lab/MMBench
---
dataset_info:
- config_name: cc
  features:
  - name: index
    dtype: int64
  - name: question
    dtype: string
  - name: answer
    dtype: string
  - name: A
    dtype: string
  - name: B
    dtype: string
  - name: C
    dtype: string
  - name: D
    dtype: string
  - name: category
    dtype: string
  - name: image
    dtype: image
  - name: source
    dtype: string
  splits:
  - name: test
    num_bytes: 51822980.0
    num_examples: 2040
  download_size: 51151713
  dataset_size: 51822980.0
- config_name: cn
  features:
  - name: index
    dtype: int64
  - name: question
    dtype: string
  - name: hint
    dtype: string
  - name: answer
    dtype: string
  - name: A
    dtype: string
  - name: B
    dtype: string
  - name: C
    dtype: string
  - name: D
    dtype: string
  - name: category
    dtype: string
  - name: image
    dtype: image
  - name: source
    dtype: string
  - name: L2-category
    dtype: string
  - name: comment
    dtype: string
  - name: split
    dtype: string
  splits:
  - name: dev
    num_bytes: 102697367.875
    num_examples: 4329
  - name: test
    num_bytes: 148085952.75
    num_examples: 6666
  download_size: 238008307
  dataset_size: 250783320.625
- config_name: en
  features:
  - name: index
    dtype: int64
  - name: question
    dtype: string
  - name: hint
    dtype: string
  - name: answer
    dtype: string
  - name: A
    dtype: string
  - name: B
    dtype: string
  - name: C
    dtype: string
  - name: D
    dtype: string
  - name: category
    dtype: string
  - name: image
    dtype: image
  - name: source
    dtype: string
  - name: L2-category
    dtype: string
  - name: comment
    dtype: string
  - name: split
    dtype: string
  splits:
  - name: dev
    num_bytes: 102785426.875
    num_examples: 4329
  - name: test
    num_bytes: 148216865.75
    num_examples: 6666
  download_size: 238044917
  dataset_size: 251002292.625
configs:
- config_name: cc
  data_files:
  - split: test
    path: cc/test-*
- config_name: cn
  data_files:
  - split: dev
    path: cn/dev-*
  - split: test
    path: cn/test-*
- config_name: en
  data_files:
  - split: dev
    path: en/dev-*
  - split: test
    path: en/test-*
---
tyzhu/random25eof_find_passage_train500000_eval1000_rare
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
dataset_info:
  features:
  - name: inputs
    dtype: string
  - name: targets
    dtype: string
  splits:
  - name: train
    num_bytes: 104305810
    num_examples: 1001000
  - name: validation
    num_bytes: 118222
    num_examples: 1000
  download_size: 0
  dataset_size: 104424032
---

# Dataset Card for "random25eof_find_passage_train500000_eval1000_rare"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AnkitSatpute/zb_top1000_ttv_str
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: label
    dtype: float64
  splits:
  - name: train
    num_bytes: 4202798
    num_examples: 136517
  - name: test
    num_bytes: 4248762
    num_examples: 136566
  - name: validation
    num_bytes: 1665826
    num_examples: 56172
  download_size: 2839517
  dataset_size: 10117386
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
  - split: validation
    path: data/validation-*
---
freddyaboulton/dope_data_points_14
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data.csv
---

# Dataset Card for Dataset Name

## Dataset Description

- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**

### Dataset Summary

[More Information Needed]

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_PulsarAI__Einstein-v3-7B
--- pretty_name: Evaluation run of Weyaxi/Einstein-v3-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Weyaxi/Einstein-v3-7B](https://huggingface.co/Weyaxi/Einstein-v3-7B) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Einstein-v3-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-09T14:20:50.060350](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v3-7B/blob/main/results_2024-02-09T14-20-50.060350.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6324191881027033,\n\ \ \"acc_stderr\": 0.03243554886430901,\n \"acc_norm\": 0.6363751404085887,\n\ \ \"acc_norm_stderr\": 0.033091894253237775,\n \"mc1\": 0.3488372093023256,\n\ \ \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.5118155053333627,\n\ \ \"mc2_stderr\": 0.014996398703517707\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279542,\n\ \ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.0141633668961926\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6344353714399522,\n\ \ \"acc_stderr\": 0.004806039039008958,\n \"acc_norm\": 0.8301135232025493,\n\ \ \"acc_norm_stderr\": 0.0037476555337545205\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\ \ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\ \ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\ : 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\ acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \ \ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.037455547914624555\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\ \ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\ \ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\ \ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\ \ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\ \ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\ \ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"\ acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 
0.024976954053155243\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\ \ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\ \ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\ \ \"acc_stderr\": 0.026450874489042774,\n \"acc_norm\": 0.6838709677419355,\n\ \ \"acc_norm_stderr\": 0.026450874489042774\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\ acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\ \ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n\ \ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\ \ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391528,\n \"\ acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391528\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\ acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\ acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \ \ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\ \ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\ acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615771,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615771\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\ \ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\ \ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\ \ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\ \ \"acc_stderr\": 0.01660256461504993,\n \"acc_norm\": 
0.4402234636871508,\n\ \ \"acc_norm_stderr\": 0.01660256461504993\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\ \ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\ \ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\ \ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n\ \ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \ \ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n\ \ \"acc_stderr\": 0.012728446067669971,\n \"acc_norm\": 0.4595827900912647,\n\ \ \"acc_norm_stderr\": 0.012728446067669971\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \ \ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786554,\n \ \ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786554\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\ \ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\ \ },\n \"harness|hendrycksTest-sociology|5\": 
{\n \"acc\": 0.7562189054726368,\n\ \ \"acc_stderr\": 0.030360490154014635,\n \"acc_norm\": 0.7562189054726368,\n\ \ \"acc_norm_stderr\": 0.030360490154014635\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \ \ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n\ \ \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.5118155053333627,\n\ \ \"mc2_stderr\": 0.014996398703517707\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205083\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44806671721000757,\n \ \ \"acc_stderr\": 0.013697992668274523\n }\n}\n```" repo_url: https://huggingface.co/Weyaxi/Einstein-v3-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|arc:challenge|25_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-09T14-20-50.060350.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|gsm8k|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_09T14_20_50.060350 path: - 
'**/details_harness|hellaswag|10_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-20-50.060350.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-20-50.060350.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-20-50.060350.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-20-50.060350.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-20-50.060350.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-20-50.060350.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-20-50.060350.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-management|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T14-20-50.060350.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|truthfulqa:mc|0_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-09T14-20-50.060350.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_09T14_20_50.060350 path: - '**/details_harness|winogrande|5_2024-02-09T14-20-50.060350.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-09T14-20-50.060350.parquet' - config_name: results data_files: - split: 
2024_02_09T14_20_50.060350 path: - results_2024-02-09T14-20-50.060350.parquet - split: latest path: - results_2024-02-09T14-20-50.060350.parquet --- # Dataset Card for Evaluation run of Weyaxi/Einstein-v3-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-v3-7B](https://huggingface.co/Weyaxi/Einstein-v3-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-v3-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-09T14:20:50.060350](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v3-7B/blob/main/results_2024-02-09T14-20-50.060350.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6324191881027033, "acc_stderr": 0.03243554886430901, "acc_norm": 0.6363751404085887, "acc_norm_stderr": 0.033091894253237775, "mc1": 0.3488372093023256, "mc1_stderr": 0.016684419859986893, "mc2": 0.5118155053333627, "mc2_stderr": 0.014996398703517707 }, "harness|arc:challenge|25": { "acc": 0.6023890784982935, "acc_stderr": 0.014301752223279542, "acc_norm": 0.6228668941979523, "acc_norm_stderr": 0.0141633668961926 }, "harness|hellaswag|10": { "acc": 0.6344353714399522, "acc_stderr": 0.004806039039008958, "acc_norm": 0.8301135232025493, "acc_norm_stderr": 0.0037476555337545205 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6513157894736842, "acc_stderr": 0.0387813988879761, "acc_norm": 0.6513157894736842, "acc_norm_stderr": 0.0387813988879761 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.037455547914624555, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416907, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416907 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082636, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5276595744680851, "acc_stderr": 0.03263597118409769, "acc_norm": 0.5276595744680851, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3783068783068783, "acc_stderr": 0.024976954053155243, "acc_norm": 0.3783068783068783, "acc_norm_stderr": 0.024976954053155243 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6838709677419355, "acc_stderr": 0.026450874489042774, "acc_norm": 0.6838709677419355, "acc_norm_stderr": 0.026450874489042774 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.029857515673386417, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.029857515673386417 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.024233532297758723, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.024233532297758723 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6384615384615384, "acc_stderr": 0.024359581465396993, "acc_norm": 0.6384615384615384, "acc_norm_stderr": 0.024359581465396993 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6512605042016807, "acc_stderr": 0.030956636328566548, "acc_norm": 0.6512605042016807, "acc_norm_stderr": 0.030956636328566548 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8201834862385321, "acc_stderr": 0.016465345467391528, "acc_norm": 0.8201834862385321, "acc_norm_stderr": 0.016465345467391528 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, 
"acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306085, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306085 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098825, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098825 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615771, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406974, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406974 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 
0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4402234636871508, "acc_stderr": 0.01660256461504993, "acc_norm": 0.4402234636871508, "acc_norm_stderr": 0.01660256461504993 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7450980392156863, "acc_stderr": 0.02495418432487991, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.02495418432487991 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6944444444444444, "acc_stderr": 0.02563082497562135, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.02563082497562135 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4595827900912647, "acc_stderr": 0.012728446067669971, "acc_norm": 0.4595827900912647, "acc_norm_stderr": 0.012728446067669971 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6486928104575164, "acc_stderr": 0.019312676065786554, "acc_norm": 0.6486928104575164, "acc_norm_stderr": 0.019312676065786554 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 
}, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7562189054726368, "acc_stderr": 0.030360490154014635, "acc_norm": 0.7562189054726368, "acc_norm_stderr": 0.030360490154014635 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3488372093023256, "mc1_stderr": 0.016684419859986893, "mc2": 0.5118155053333627, "mc2_stderr": 0.014996398703517707 }, "harness|winogrande|5": { "acc": 0.7995264404104183, "acc_stderr": 0.011251958281205083 }, "harness|gsm8k|5": { "acc": 0.44806671721000757, "acc_stderr": 0.013697992668274523 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
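As a small usage sketch, the per-task entries in the "Latest results" JSON above can be post-processed without extra tooling. The `results` dict below is an abbreviated hand-copied excerpt of that JSON (the real file holds all 63 tasks), used here only to illustrate averaging the MMLU (`hendrycksTest`) sub-task scores:

```python
# Sketch: averaging per-task accuracies from the "Latest results" JSON above.
# `results` is an abbreviated excerpt of that JSON, not the full file.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6228668941979523},
    "harness|hellaswag|10": {"acc_norm": 0.8301135232025493},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6370370370370371},
}

# Collect only the MMLU (hendrycksTest) sub-tasks and average them.
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))  # → 0.4935
```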
CyberHarem/marblehead_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of marblehead/マーブルヘッド/马布尔黑德 (Azur Lane) This is the dataset of marblehead/マーブルヘッド/马布尔黑德 (Azur Lane), containing 74 images and their tags. The core tags of this character are `blonde_hair, blue_eyes, breasts, hair_ornament, multicolored_hair, large_breasts, hairclip, pink_hair, two-tone_hair, hair_between_eyes, bangs, sidelocks`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 74 | 97.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 74 | 59.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 177 | 122.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 74 | 86.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 177 | 171.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marblehead_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/marblehead_azurlane', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, bare_shoulders, smile, black_pants, blush, midriff, navel, parted_lips, short_hair_with_long_locks, sleeveless, cleavage_cutout, gyaru, long_hair, standing, sweat, crop_top, sports_bra, bare_arms, black_shirt, purple_hair, simple_background, symbol-shaped_pupils, white_background | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, breast_tattoo, gyaru, looking_at_viewer, navel_piercing, short_hair_with_long_locks, short_shorts, smile, solo, thighhighs, cleavage, id_card, black_shorts, leotard_under_clothes, simple_background, white_background, full_body, white_coat, sleeves_past_fingers | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, looking_at_viewer, bare_shoulders, garter_straps, smile, solo, black_thighhighs, blush, ass, off-shoulder_sweater, short_hair_with_long_locks, cleavage, official_alternate_costume, standing, sweater_dress, christmas, from_behind, gift, long_hair | ### Table Version | # | 
Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | smile | black_pants | blush | midriff | navel | parted_lips | short_hair_with_long_locks | sleeveless | cleavage_cutout | gyaru | long_hair | standing | sweat | crop_top | sports_bra | bare_arms | black_shirt | purple_hair | simple_background | symbol-shaped_pupils | white_background | breast_tattoo | navel_piercing | short_shorts | thighhighs | cleavage | id_card | black_shorts | leotard_under_clothes | full_body | white_coat | sleeves_past_fingers | garter_straps | black_thighhighs | ass | off-shoulder_sweater | official_alternate_costume | sweater_dress | christmas | from_behind | gift | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:--------|:--------------|:--------|:----------|:--------|:--------------|:-----------------------------|:-------------|:------------------|:--------|:------------|:-----------|:--------|:-----------|:-------------|:------------|:--------------|:--------------|:--------------------|:-----------------------|:-------------------|:----------------|:-----------------|:---------------|:-------------|:-----------|:----------|:---------------|:------------------------|:------------|:-------------|:-----------------------|:----------------|:-------------------|:------|:-----------------------|:-----------------------------|:----------------|:------------|:--------------|:-------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | 
![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | | | | | | X | | | X | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | X | | | | X | | | | X | X | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X |
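The IMG+TXT packages listed above pair each image with a comma-separated tag string (like the ones shown in the cluster tables), and the core tags named at the top of this card are pruned from the dataset. A minimal parsing sketch — the exact TXT layout is an assumption here, not a spec of the archive format:

```python
# Sketch: parsing a comma-separated tag line such as those in the cluster tables above.
# CORE_TAGS reproduces the pruned core tags listed at the top of this card.
CORE_TAGS = {
    "blonde_hair", "blue_eyes", "breasts", "hair_ornament", "multicolored_hair",
    "large_breasts", "hairclip", "pink_hair", "two-tone_hair", "hair_between_eyes",
    "bangs", "sidelocks",
}

def parse_tags(line: str) -> list:
    """Split a comma-separated tag string and drop any pruned core tags."""
    tags = [t.strip() for t in line.split(",") if t.strip()]
    return [t for t in tags if t not in CORE_TAGS]

example = "1girl, blonde_hair, looking_at_viewer, solo, blue_eyes, smile"
print(parse_tags(example))  # → ['1girl', 'looking_at_viewer', 'solo', 'smile']
```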
hongerzh/nft_prediction_all_NFTs
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: image dtype: image - name: text dtype: string - name: label dtype: float64 - name: sold_price dtype: float64 - name: matching_speed dtype: float64 - name: time dtype: float64 splits: - name: train num_bytes: 12223325706.04 num_examples: 70256 - name: validation num_bytes: 3664228516.045 num_examples: 10035 - name: test num_bytes: 3975494326.881 num_examples: 20073 download_size: 16061854920 dataset_size: 19863048548.966 --- # Dataset Card for "nft_prediction_all_NFTs" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
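The split sizes declared in the YAML above imply an approximately 70/10/20 train/validation/test partition; a quick sanity-check sketch using only those counts:

```python
# Sketch: checking the train/validation/test proportions from the split sizes above.
splits = {"train": 70256, "validation": 10035, "test": 20073}
total = sum(splits.values())
ratios = {name: round(n / total, 3) for name, n in splits.items()}
print(ratios)  # → {'train': 0.7, 'validation': 0.1, 'test': 0.2}
```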
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_84
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1112423780.0 num_examples: 218465 download_size: 1134472201 dataset_size: 1112423780.0 --- # Dataset Card for "chunk_84" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mozay22/icd_cpt_codes
--- license: mit ---
iElexperio/processedMorDatCordV2
--- dataset_info: features: - name: image dtype: image - name: label dtype: string splits: - name: train num_bytes: 10460102.0 num_examples: 80 - name: test num_bytes: 2233473.0 num_examples: 19 download_size: 11626414 dataset_size: 12693575.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* --- # Dataset Card for "processedMorDatCordV2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
abhishek/autotrain-data-i6l7-e3p1-lu90
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: autotrain_image dtype: image - name: autotrain_label dtype: class_label: names: '0': daisy '1': dandelion '2': rose '3': sunflower '4': tulip splits: - name: train num_bytes: 114410927.672 num_examples: 2196 - name: validation num_bytes: 33682367.0 num_examples: 550 download_size: 166945851 dataset_size: 148093294.672 --- # Dataset Card for "autotrain-data-i6l7-e3p1-lu90" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
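The `class_label` block in the YAML above maps integer labels to the five flower classes; when working outside the `datasets` library (which exposes this via the features object), the mapping can be reproduced in plain Python as a sketch:

```python
# Sketch: decoding integer class labels using the names declared in the YAML above.
CLASS_NAMES = ["daisy", "dandelion", "rose", "sunflower", "tulip"]

def int2str(label: int) -> str:
    """Map an integer label to its class name, mirroring the card's class_label block."""
    return CLASS_NAMES[label]

print(int2str(3))  # → sunflower
```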
KonradSzafer/stackoverflow_linux
--- dataset_info: features: - name: title dtype: string - name: question dtype: string - name: answer dtype: string - name: url dtype: string splits: - name: train num_bytes: 303464 num_examples: 270 - name: test num_bytes: 37456 num_examples: 30 download_size: 172425 dataset_size: 340920 task_categories: - question-answering language: - en pretty_name: Stack Overflow Linux size_categories: - n<1K --- # Dataset Card for "stackoverflow_linux" Dataset information: - Source: Stack Overflow - Category: Linux - Number of samples: 300 - Train/Test split: 270/30 - Quality: Data come from the top 1k most upvoted questions ## Additional Information ### License All Stack Overflow user contributions are licensed under CC-BY-SA 3.0 with attribution required. [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
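Each row carries the `title`, `question`, `answer`, and `url` fields listed above; a minimal sketch of formatting one row into a QA prompt (the sample row below is invented for illustration, not an actual dataset record):

```python
# Sketch: formatting a row with the card's four fields into a simple QA prompt.
def format_row(row: dict) -> str:
    return (
        f"Title: {row['title']}\n"
        f"Question: {row['question']}\n"
        f"Answer: {row['answer']}\n"
        f"Source: {row['url']}"
    )

# Illustrative sample row (not taken from the dataset).
row = {
    "title": "How do I list open ports?",
    "question": "Which command shows listening TCP ports on Linux?",
    "answer": "Use `ss -tlnp` (or the older `netstat -tlnp`).",
    "url": "https://stackoverflow.com/q/example",
}
print(format_row(row))
```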
IanTseng/Med-term1
--- dataset_info: features: - name: TEXT dtype: string - name: LOCATION dtype: string - name: LABEL dtype: string splits: - name: train num_bytes: 4338102515 num_examples: 4000000 download_size: 2415329628 dataset_size: 4338102515 configs: - config_name: default data_files: - split: train path: data/train-* ---
EgilKarlsen/Thunderbird_RoBERTa_Finetuned
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: '0' dtype: float32 - name: '1' dtype: float32 - name: '2' dtype: float32 - name: '3' dtype: float32 - name: '4' dtype: float32 - name: '5' dtype: float32 - name: '6' dtype: float32 - name: '7' dtype: float32 - name: '8' dtype: float32 - name: '9' dtype: float32 - name: '10' dtype: float32 - name: '11' dtype: float32 - name: '12' dtype: float32 - name: '13' dtype: float32 - name: '14' dtype: float32 - name: '15' dtype: float32 - name: '16' dtype: float32 - name: '17' dtype: float32 - name: '18' dtype: float32 - name: '19' dtype: float32 - name: '20' dtype: float32 - name: '21' dtype: float32 - name: '22' dtype: float32 - name: '23' dtype: float32 - name: '24' dtype: float32 - name: '25' dtype: float32 - name: '26' dtype: float32 - name: '27' dtype: float32 - name: '28' dtype: float32 - name: '29' dtype: float32 - name: '30' dtype: float32 - name: '31' dtype: float32 - name: '32' dtype: float32 - name: '33' dtype: float32 - name: '34' dtype: float32 - name: '35' dtype: float32 - name: '36' dtype: float32 - name: '37' dtype: float32 - name: '38' dtype: float32 - name: '39' dtype: float32 - name: '40' dtype: float32 - name: '41' dtype: float32 - name: '42' dtype: float32 - name: '43' dtype: float32 - name: '44' dtype: float32 - name: '45' dtype: float32 - name: '46' dtype: float32 - name: '47' dtype: float32 - name: '48' dtype: float32 - name: '49' dtype: float32 - name: '50' dtype: float32 - name: '51' dtype: float32 - name: '52' dtype: float32 - name: '53' dtype: float32 - name: '54' dtype: float32 - name: '55' dtype: float32 - name: '56' dtype: float32 - name: '57' dtype: float32 - name: '58' dtype: float32 - name: '59' dtype: float32 - name: '60' dtype: float32 - name: '61' dtype: float32 - name: '62' dtype: float32 - name: '63' dtype: float32 - name: '64' dtype: float32 - name: '65' dtype: float32 - name: '66' dtype: 
float32 - name: '67' dtype: float32 - name: '68' dtype: float32 - name: '69' dtype: float32 - name: '70' dtype: float32 - name: '71' dtype: float32 - name: '72' dtype: float32 - name: '73' dtype: float32 - name: '74' dtype: float32 - name: '75' dtype: float32 - name: '76' dtype: float32 - name: '77' dtype: float32 - name: '78' dtype: float32 - name: '79' dtype: float32 - name: '80' dtype: float32 - name: '81' dtype: float32 - name: '82' dtype: float32 - name: '83' dtype: float32 - name: '84' dtype: float32 - name: '85' dtype: float32 - name: '86' dtype: float32 - name: '87' dtype: float32 - name: '88' dtype: float32 - name: '89' dtype: float32 - name: '90' dtype: float32 - name: '91' dtype: float32 - name: '92' dtype: float32 - name: '93' dtype: float32 - name: '94' dtype: float32 - name: '95' dtype: float32 - name: '96' dtype: float32 - name: '97' dtype: float32 - name: '98' dtype: float32 - name: '99' dtype: float32 - name: '100' dtype: float32 - name: '101' dtype: float32 - name: '102' dtype: float32 - name: '103' dtype: float32 - name: '104' dtype: float32 - name: '105' dtype: float32 - name: '106' dtype: float32 - name: '107' dtype: float32 - name: '108' dtype: float32 - name: '109' dtype: float32 - name: '110' dtype: float32 - name: '111' dtype: float32 - name: '112' dtype: float32 - name: '113' dtype: float32 - name: '114' dtype: float32 - name: '115' dtype: float32 - name: '116' dtype: float32 - name: '117' dtype: float32 - name: '118' dtype: float32 - name: '119' dtype: float32 - name: '120' dtype: float32 - name: '121' dtype: float32 - name: '122' dtype: float32 - name: '123' dtype: float32 - name: '124' dtype: float32 - name: '125' dtype: float32 - name: '126' dtype: float32 - name: '127' dtype: float32 - name: '128' dtype: float32 - name: '129' dtype: float32 - name: '130' dtype: float32 - name: '131' dtype: float32 - name: '132' dtype: float32 - name: '133' dtype: float32 - name: '134' dtype: float32 - name: '135' dtype: float32 - name: '136' dtype: 
float32 - name: '137' dtype: float32 - name: '138' dtype: float32 - name: '139' dtype: float32 - name: '140' dtype: float32 - name: '141' dtype: float32 - name: '142' dtype: float32 - name: '143' dtype: float32 - name: '144' dtype: float32 - name: '145' dtype: float32 - name: '146' dtype: float32 - name: '147' dtype: float32 - name: '148' dtype: float32 - name: '149' dtype: float32 - name: '150' dtype: float32 - name: '151' dtype: float32 - name: '152' dtype: float32 - name: '153' dtype: float32 - name: '154' dtype: float32 - name: '155' dtype: float32 - name: '156' dtype: float32 - name: '157' dtype: float32 - name: '158' dtype: float32 - name: '159' dtype: float32 - name: '160' dtype: float32 - name: '161' dtype: float32 - name: '162' dtype: float32 - name: '163' dtype: float32 - name: '164' dtype: float32 - name: '165' dtype: float32 - name: '166' dtype: float32 - name: '167' dtype: float32 - name: '168' dtype: float32 - name: '169' dtype: float32 - name: '170' dtype: float32 - name: '171' dtype: float32 - name: '172' dtype: float32 - name: '173' dtype: float32 - name: '174' dtype: float32 - name: '175' dtype: float32 - name: '176' dtype: float32 - name: '177' dtype: float32 - name: '178' dtype: float32 - name: '179' dtype: float32 - name: '180' dtype: float32 - name: '181' dtype: float32 - name: '182' dtype: float32 - name: '183' dtype: float32 - name: '184' dtype: float32 - name: '185' dtype: float32 - name: '186' dtype: float32 - name: '187' dtype: float32 - name: '188' dtype: float32 - name: '189' dtype: float32 - name: '190' dtype: float32 - name: '191' dtype: float32 - name: '192' dtype: float32 - name: '193' dtype: float32 - name: '194' dtype: float32 - name: '195' dtype: float32 - name: '196' dtype: float32 - name: '197' dtype: float32 - name: '198' dtype: float32 - name: '199' dtype: float32 - name: '200' dtype: float32 - name: '201' dtype: float32 - name: '202' dtype: float32 - name: '203' dtype: float32 - name: '204' dtype: float32 - name: '205' 
dtype: float32 - name: '206' dtype: float32 - name: '207' dtype: float32 - name: '208' dtype: float32 - name: '209' dtype: float32 - name: '210' dtype: float32 - name: '211' dtype: float32 - name: '212' dtype: float32 - name: '213' dtype: float32 - name: '214' dtype: float32 - name: '215' dtype: float32 - name: '216' dtype: float32 - name: '217' dtype: float32 - name: '218' dtype: float32 - name: '219' dtype: float32 - name: '220' dtype: float32 - name: '221' dtype: float32 - name: '222' dtype: float32 - name: '223' dtype: float32 - name: '224' dtype: float32 - name: '225' dtype: float32 - name: '226' dtype: float32 - name: '227' dtype: float32 - name: '228' dtype: float32 - name: '229' dtype: float32 - name: '230' dtype: float32 - name: '231' dtype: float32 - name: '232' dtype: float32 - name: '233' dtype: float32 - name: '234' dtype: float32 - name: '235' dtype: float32 - name: '236' dtype: float32 - name: '237' dtype: float32 - name: '238' dtype: float32 - name: '239' dtype: float32 - name: '240' dtype: float32 - name: '241' dtype: float32 - name: '242' dtype: float32 - name: '243' dtype: float32 - name: '244' dtype: float32 - name: '245' dtype: float32 - name: '246' dtype: float32 - name: '247' dtype: float32 - name: '248' dtype: float32 - name: '249' dtype: float32 - name: '250' dtype: float32 - name: '251' dtype: float32 - name: '252' dtype: float32 - name: '253' dtype: float32 - name: '254' dtype: float32 - name: '255' dtype: float32 - name: '256' dtype: float32 - name: '257' dtype: float32 - name: '258' dtype: float32 - name: '259' dtype: float32 - name: '260' dtype: float32 - name: '261' dtype: float32 - name: '262' dtype: float32 - name: '263' dtype: float32 - name: '264' dtype: float32 - name: '265' dtype: float32 - name: '266' dtype: float32 - name: '267' dtype: float32 - name: '268' dtype: float32 - name: '269' dtype: float32 - name: '270' dtype: float32 - name: '271' dtype: float32 - name: '272' dtype: float32 - name: '273' dtype: float32 - name: 
'274' dtype: float32 - name: '275' dtype: float32 - name: '276' dtype: float32 - name: '277' dtype: float32 - name: '278' dtype: float32 - name: '279' dtype: float32 - name: '280' dtype: float32 - name: '281' dtype: float32 - name: '282' dtype: float32 - name: '283' dtype: float32 - name: '284' dtype: float32 - name: '285' dtype: float32 - name: '286' dtype: float32 - name: '287' dtype: float32 - name: '288' dtype: float32 - name: '289' dtype: float32 - name: '290' dtype: float32 - name: '291' dtype: float32 - name: '292' dtype: float32 - name: '293' dtype: float32 - name: '294' dtype: float32 - name: '295' dtype: float32 - name: '296' dtype: float32 - name: '297' dtype: float32 - name: '298' dtype: float32 - name: '299' dtype: float32 - name: '300' dtype: float32 - name: '301' dtype: float32 - name: '302' dtype: float32 - name: '303' dtype: float32 - name: '304' dtype: float32 - name: '305' dtype: float32 - name: '306' dtype: float32 - name: '307' dtype: float32 - name: '308' dtype: float32 - name: '309' dtype: float32 - name: '310' dtype: float32 - name: '311' dtype: float32 - name: '312' dtype: float32 - name: '313' dtype: float32 - name: '314' dtype: float32 - name: '315' dtype: float32 - name: '316' dtype: float32 - name: '317' dtype: float32 - name: '318' dtype: float32 - name: '319' dtype: float32 - name: '320' dtype: float32 - name: '321' dtype: float32 - name: '322' dtype: float32 - name: '323' dtype: float32 - name: '324' dtype: float32 - name: '325' dtype: float32 - name: '326' dtype: float32 - name: '327' dtype: float32 - name: '328' dtype: float32 - name: '329' dtype: float32 - name: '330' dtype: float32 - name: '331' dtype: float32 - name: '332' dtype: float32 - name: '333' dtype: float32 - name: '334' dtype: float32 - name: '335' dtype: float32 - name: '336' dtype: float32 - name: '337' dtype: float32 - name: '338' dtype: float32 - name: '339' dtype: float32 - name: '340' dtype: float32 - name: '341' dtype: float32 - name: '342' dtype: float32 - 
name: '343' dtype: float32 - name: '344' dtype: float32 - name: '345' dtype: float32 - name: '346' dtype: float32 - name: '347' dtype: float32 - name: '348' dtype: float32 - name: '349' dtype: float32 - name: '350' dtype: float32 - name: '351' dtype: float32 - name: '352' dtype: float32 - name: '353' dtype: float32 - name: '354' dtype: float32 - name: '355' dtype: float32 - name: '356' dtype: float32 - name: '357' dtype: float32 - name: '358' dtype: float32 - name: '359' dtype: float32 - name: '360' dtype: float32 - name: '361' dtype: float32 - name: '362' dtype: float32 - name: '363' dtype: float32 - name: '364' dtype: float32 - name: '365' dtype: float32 - name: '366' dtype: float32 - name: '367' dtype: float32 - name: '368' dtype: float32 - name: '369' dtype: float32 - name: '370' dtype: float32 - name: '371' dtype: float32 - name: '372' dtype: float32 - name: '373' dtype: float32 - name: '374' dtype: float32 - name: '375' dtype: float32 - name: '376' dtype: float32 - name: '377' dtype: float32 - name: '378' dtype: float32 - name: '379' dtype: float32 - name: '380' dtype: float32 - name: '381' dtype: float32 - name: '382' dtype: float32 - name: '383' dtype: float32 - name: '384' dtype: float32 - name: '385' dtype: float32 - name: '386' dtype: float32 - name: '387' dtype: float32 - name: '388' dtype: float32 - name: '389' dtype: float32 - name: '390' dtype: float32 - name: '391' dtype: float32 - name: '392' dtype: float32 - name: '393' dtype: float32 - name: '394' dtype: float32 - name: '395' dtype: float32 - name: '396' dtype: float32 - name: '397' dtype: float32 - name: '398' dtype: float32 - name: '399' dtype: float32 - name: '400' dtype: float32 - name: '401' dtype: float32 - name: '402' dtype: float32 - name: '403' dtype: float32 - name: '404' dtype: float32 - name: '405' dtype: float32 - name: '406' dtype: float32 - name: '407' dtype: float32 - name: '408' dtype: float32 - name: '409' dtype: float32 - name: '410' dtype: float32 - name: '411' dtype: float32 
- name: '412' dtype: float32 - name: '413' dtype: float32 - name: '414' dtype: float32 - name: '415' dtype: float32 - name: '416' dtype: float32 - name: '417' dtype: float32 - name: '418' dtype: float32 - name: '419' dtype: float32 - name: '420' dtype: float32 - name: '421' dtype: float32 - name: '422' dtype: float32 - name: '423' dtype: float32 - name: '424' dtype: float32 - name: '425' dtype: float32 - name: '426' dtype: float32 - name: '427' dtype: float32 - name: '428' dtype: float32 - name: '429' dtype: float32 - name: '430' dtype: float32 - name: '431' dtype: float32 - name: '432' dtype: float32 - name: '433' dtype: float32 - name: '434' dtype: float32 - name: '435' dtype: float32 - name: '436' dtype: float32 - name: '437' dtype: float32 - name: '438' dtype: float32 - name: '439' dtype: float32 - name: '440' dtype: float32 - name: '441' dtype: float32 - name: '442' dtype: float32 - name: '443' dtype: float32 - name: '444' dtype: float32 - name: '445' dtype: float32 - name: '446' dtype: float32 - name: '447' dtype: float32 - name: '448' dtype: float32 - name: '449' dtype: float32 - name: '450' dtype: float32 - name: '451' dtype: float32 - name: '452' dtype: float32 - name: '453' dtype: float32 - name: '454' dtype: float32 - name: '455' dtype: float32 - name: '456' dtype: float32 - name: '457' dtype: float32 - name: '458' dtype: float32 - name: '459' dtype: float32 - name: '460' dtype: float32 - name: '461' dtype: float32 - name: '462' dtype: float32 - name: '463' dtype: float32 - name: '464' dtype: float32 - name: '465' dtype: float32 - name: '466' dtype: float32 - name: '467' dtype: float32 - name: '468' dtype: float32 - name: '469' dtype: float32 - name: '470' dtype: float32 - name: '471' dtype: float32 - name: '472' dtype: float32 - name: '473' dtype: float32 - name: '474' dtype: float32 - name: '475' dtype: float32 - name: '476' dtype: float32 - name: '477' dtype: float32 - name: '478' dtype: float32 - name: '479' dtype: float32 - name: '480' dtype: 
float32 - name: '481' dtype: float32 - name: '482' dtype: float32 - name: '483' dtype: float32 - name: '484' dtype: float32 - name: '485' dtype: float32 - name: '486' dtype: float32 - name: '487' dtype: float32 - name: '488' dtype: float32 - name: '489' dtype: float32 - name: '490' dtype: float32 - name: '491' dtype: float32 - name: '492' dtype: float32 - name: '493' dtype: float32 - name: '494' dtype: float32 - name: '495' dtype: float32 - name: '496' dtype: float32 - name: '497' dtype: float32 - name: '498' dtype: float32 - name: '499' dtype: float32 - name: '500' dtype: float32 - name: '501' dtype: float32 - name: '502' dtype: float32 - name: '503' dtype: float32 - name: '504' dtype: float32 - name: '505' dtype: float32 - name: '506' dtype: float32 - name: '507' dtype: float32 - name: '508' dtype: float32 - name: '509' dtype: float32 - name: '510' dtype: float32 - name: '511' dtype: float32 - name: '512' dtype: float32 - name: '513' dtype: float32 - name: '514' dtype: float32 - name: '515' dtype: float32 - name: '516' dtype: float32 - name: '517' dtype: float32 - name: '518' dtype: float32 - name: '519' dtype: float32 - name: '520' dtype: float32 - name: '521' dtype: float32 - name: '522' dtype: float32 - name: '523' dtype: float32 - name: '524' dtype: float32 - name: '525' dtype: float32 - name: '526' dtype: float32 - name: '527' dtype: float32 - name: '528' dtype: float32 - name: '529' dtype: float32 - name: '530' dtype: float32 - name: '531' dtype: float32 - name: '532' dtype: float32 - name: '533' dtype: float32 - name: '534' dtype: float32 - name: '535' dtype: float32 - name: '536' dtype: float32 - name: '537' dtype: float32 - name: '538' dtype: float32 - name: '539' dtype: float32 - name: '540' dtype: float32 - name: '541' dtype: float32 - name: '542' dtype: float32 - name: '543' dtype: float32 - name: '544' dtype: float32 - name: '545' dtype: float32 - name: '546' dtype: float32 - name: '547' dtype: float32 - name: '548' dtype: float32 - name: '549' 
dtype: float32 - name: '550' dtype: float32 - name: '551' dtype: float32 - name: '552' dtype: float32 - name: '553' dtype: float32 - name: '554' dtype: float32 - name: '555' dtype: float32 - name: '556' dtype: float32 - name: '557' dtype: float32 - name: '558' dtype: float32 - name: '559' dtype: float32 - name: '560' dtype: float32 - name: '561' dtype: float32 - name: '562' dtype: float32 - name: '563' dtype: float32 - name: '564' dtype: float32 - name: '565' dtype: float32 - name: '566' dtype: float32 - name: '567' dtype: float32 - name: '568' dtype: float32 - name: '569' dtype: float32 - name: '570' dtype: float32 - name: '571' dtype: float32 - name: '572' dtype: float32 - name: '573' dtype: float32 - name: '574' dtype: float32 - name: '575' dtype: float32 - name: '576' dtype: float32 - name: '577' dtype: float32 - name: '578' dtype: float32 - name: '579' dtype: float32 - name: '580' dtype: float32 - name: '581' dtype: float32 - name: '582' dtype: float32 - name: '583' dtype: float32 - name: '584' dtype: float32 - name: '585' dtype: float32 - name: '586' dtype: float32 - name: '587' dtype: float32 - name: '588' dtype: float32 - name: '589' dtype: float32 - name: '590' dtype: float32 - name: '591' dtype: float32 - name: '592' dtype: float32 - name: '593' dtype: float32 - name: '594' dtype: float32 - name: '595' dtype: float32 - name: '596' dtype: float32 - name: '597' dtype: float32 - name: '598' dtype: float32 - name: '599' dtype: float32 - name: '600' dtype: float32 - name: '601' dtype: float32 - name: '602' dtype: float32 - name: '603' dtype: float32 - name: '604' dtype: float32 - name: '605' dtype: float32 - name: '606' dtype: float32 - name: '607' dtype: float32 - name: '608' dtype: float32 - name: '609' dtype: float32 - name: '610' dtype: float32 - name: '611' dtype: float32 - name: '612' dtype: float32 - name: '613' dtype: float32 - name: '614' dtype: float32 - name: '615' dtype: float32 - name: '616' dtype: float32 - name: '617' dtype: float32 - name: 
'618' dtype: float32 - name: '619' dtype: float32 - name: '620' dtype: float32 - name: '621' dtype: float32 - name: '622' dtype: float32 - name: '623' dtype: float32 - name: '624' dtype: float32 - name: '625' dtype: float32 - name: '626' dtype: float32 - name: '627' dtype: float32 - name: '628' dtype: float32 - name: '629' dtype: float32 - name: '630' dtype: float32 - name: '631' dtype: float32 - name: '632' dtype: float32 - name: '633' dtype: float32 - name: '634' dtype: float32 - name: '635' dtype: float32 - name: '636' dtype: float32 - name: '637' dtype: float32 - name: '638' dtype: float32 - name: '639' dtype: float32 - name: '640' dtype: float32 - name: '641' dtype: float32 - name: '642' dtype: float32 - name: '643' dtype: float32 - name: '644' dtype: float32 - name: '645' dtype: float32 - name: '646' dtype: float32 - name: '647' dtype: float32 - name: '648' dtype: float32 - name: '649' dtype: float32 - name: '650' dtype: float32 - name: '651' dtype: float32 - name: '652' dtype: float32 - name: '653' dtype: float32 - name: '654' dtype: float32 - name: '655' dtype: float32 - name: '656' dtype: float32 - name: '657' dtype: float32 - name: '658' dtype: float32 - name: '659' dtype: float32 - name: '660' dtype: float32 - name: '661' dtype: float32 - name: '662' dtype: float32 - name: '663' dtype: float32 - name: '664' dtype: float32 - name: '665' dtype: float32 - name: '666' dtype: float32 - name: '667' dtype: float32 - name: '668' dtype: float32 - name: '669' dtype: float32 - name: '670' dtype: float32 - name: '671' dtype: float32 - name: '672' dtype: float32 - name: '673' dtype: float32 - name: '674' dtype: float32 - name: '675' dtype: float32 - name: '676' dtype: float32 - name: '677' dtype: float32 - name: '678' dtype: float32 - name: '679' dtype: float32 - name: '680' dtype: float32 - name: '681' dtype: float32 - name: '682' dtype: float32 - name: '683' dtype: float32 - name: '684' dtype: float32 - name: '685' dtype: float32 - name: '686' dtype: float32 - 
name: '687' dtype: float32 - name: '688' dtype: float32 - name: '689' dtype: float32 - name: '690' dtype: float32 - name: '691' dtype: float32 - name: '692' dtype: float32 - name: '693' dtype: float32 - name: '694' dtype: float32 - name: '695' dtype: float32 - name: '696' dtype: float32 - name: '697' dtype: float32 - name: '698' dtype: float32 - name: '699' dtype: float32 - name: '700' dtype: float32 - name: '701' dtype: float32 - name: '702' dtype: float32 - name: '703' dtype: float32 - name: '704' dtype: float32 - name: '705' dtype: float32 - name: '706' dtype: float32 - name: '707' dtype: float32 - name: '708' dtype: float32 - name: '709' dtype: float32 - name: '710' dtype: float32 - name: '711' dtype: float32 - name: '712' dtype: float32 - name: '713' dtype: float32 - name: '714' dtype: float32 - name: '715' dtype: float32 - name: '716' dtype: float32 - name: '717' dtype: float32 - name: '718' dtype: float32 - name: '719' dtype: float32 - name: '720' dtype: float32 - name: '721' dtype: float32 - name: '722' dtype: float32 - name: '723' dtype: float32 - name: '724' dtype: float32 - name: '725' dtype: float32 - name: '726' dtype: float32 - name: '727' dtype: float32 - name: '728' dtype: float32 - name: '729' dtype: float32 - name: '730' dtype: float32 - name: '731' dtype: float32 - name: '732' dtype: float32 - name: '733' dtype: float32 - name: '734' dtype: float32 - name: '735' dtype: float32 - name: '736' dtype: float32 - name: '737' dtype: float32 - name: '738' dtype: float32 - name: '739' dtype: float32 - name: '740' dtype: float32 - name: '741' dtype: float32 - name: '742' dtype: float32 - name: '743' dtype: float32 - name: '744' dtype: float32 - name: '745' dtype: float32 - name: '746' dtype: float32 - name: '747' dtype: float32 - name: '748' dtype: float32 - name: '749' dtype: float32 - name: '750' dtype: float32 - name: '751' dtype: float32 - name: '752' dtype: float32 - name: '753' dtype: float32 - name: '754' dtype: float32 - name: '755' dtype: float32 
- name: '756' dtype: float32 - name: '757' dtype: float32 - name: '758' dtype: float32 - name: '759' dtype: float32 - name: '760' dtype: float32 - name: '761' dtype: float32 - name: '762' dtype: float32 - name: '763' dtype: float32 - name: '764' dtype: float32 - name: '765' dtype: float32 - name: '766' dtype: float32 - name: '767' dtype: float32 - name: label dtype: string splits: - name: train num_bytes: 115576729.6875 num_examples: 37500 - name: test num_bytes: 38525577.5 num_examples: 12500 download_size: 211881241 dataset_size: 154102307.1875 --- # Dataset Card for "Thunderbird_RoBERTa_Finetuned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
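The feature schema above is 768 `float32` columns named `'0'` through `'767'` plus a string `label`, i.e. each row is a flat 768-dimensional embedding. A minimal sketch of packing one such row back into a vector (the column names come from the YAML above; the row values here are dummies, not real data):

```python
# Sketch: gather the string-numbered columns '0'..'767' of one row
# into a single embedding vector. Only the column names are taken
# from the card; the example row below is invented.
def row_to_vector(row: dict) -> list:
    """Collect columns '0'..'767' into one list of floats."""
    return [float(row[str(i)]) for i in range(768)]

# Illustrative row with dummy values; a real row would come from the dataset.
row = {str(i): float(i) for i in range(768)}
row["label"] = "example"

vec = row_to_vector(row)
print(len(vec))  # 768
```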
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-5000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 1070517 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
Sina-Alinejad-2002/round_operation_prediction
--- dataset_info: features: - name: text dtype: string - name: label dtype: int64 splits: - name: train num_bytes: 671661 num_examples: 711 - name: validation num_bytes: 77340 num_examples: 83 download_size: 499770 dataset_size: 749001 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* ---
Gabriel1322/MC-POZE-MODEL
--- license: openrail ---
myrtotsok/clf-noSbI
--- dataset_info: features: - name: request dtype: string - name: label dtype: int64 splits: - name: train num_bytes: 83331 num_examples: 960 - name: validation num_bytes: 20836 num_examples: 240 download_size: 24086 dataset_size: 104167 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* ---
supinyu/goat-chinese
--- license: apache-2.0 task_categories: - question-answering language: - zh size_categories: - 1M<n<10M --- goat Chinese arithmetic dataset: the templates of the goat dataset are replaced with Chinese templates, while the mathematical expressions are left unchanged.
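The template replacement the card describes can be sketched as follows. The two template strings are invented for illustration (the real goat templates differ); the point is that only the natural-language wrapper changes while the arithmetic expression is carried over verbatim:

```python
# Hypothetical English/Chinese template pair (not the actual goat templates);
# the same arithmetic expression is substituted unchanged into both.
EN_TEMPLATE = "What is {expr}?"
ZH_TEMPLATE = "{expr} 等于多少?"

def make_pair(expr: str) -> tuple:
    """Render one expression under the English and Chinese templates."""
    return EN_TEMPLATE.format(expr=expr), ZH_TEMPLATE.format(expr=expr)

en, zh = make_pair("3 + 5 * 2")
print(en)  # What is 3 + 5 * 2?
print(zh)  # 3 + 5 * 2 等于多少?
```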
result-kand2-sdxl-wuerst-karlo/02dd1f44
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 158 num_examples: 10 download_size: 1302 dataset_size: 158 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "02dd1f44" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tydymy/150bp_multi_species_dataset
--- dataset_info: features: - name: '#genome' dtype: string - name: asm_name dtype: string - name: assembly_accession dtype: string - name: bioproject dtype: string - name: biosample dtype: string - name: wgs_master dtype: float64 - name: seq_rel_date dtype: string - name: submitter dtype: string - name: ftp_path dtype: string - name: img_id dtype: float64 - name: gtdb_id dtype: string - name: scope dtype: string - name: assembly_level dtype: string - name: genome_rep dtype: string - name: refseq_category dtype: string - name: release_type dtype: string - name: taxid dtype: float64 - name: species_taxid dtype: float64 - name: organism_name dtype: string - name: infraspecific_name dtype: string - name: isolate dtype: string - name: superkingdom dtype: string - name: phylum dtype: string - name: class dtype: string - name: order dtype: string - name: family dtype: string - name: genus dtype: string - name: species dtype: string - name: classified dtype: bool - name: lv1_group dtype: string - name: lv2_group dtype: string - name: score_faa dtype: float64 - name: score_fna dtype: float64 - name: score_rrna dtype: float64 - name: score_trna dtype: float64 - name: total_length dtype: float64 - name: contigs dtype: float64 - name: gc dtype: float64 - name: n50 dtype: float64 - name: l50 dtype: float64 - name: proteins dtype: float64 - name: protein_length dtype: float64 - name: coding_density dtype: float64 - name: completeness dtype: float64 - name: contamination dtype: float64 - name: strain_heterogeneity dtype: float64 - name: markers dtype: float64 - name: 5s_rrna dtype: string - name: 16s_rrna dtype: string - name: 23s_rrna dtype: string - name: trnas dtype: float64 - name: draft_quality dtype: string - name: start_position dtype: int64 - name: human_label dtype: int64 - name: autotrain_text dtype: string - name: autotrain_label dtype: class_label: names: '0': Acetobacter pasteurianus IFO 3283-01 IFO 3283 substr. 
IFO 3283-01 '1': Alcanivorax borkumensis SK2 '2': Aquifex aeolicus VF5 '3': Archaeoglobus fulgidus DSM 4304 '4': Azorhizobium caulinodans ORS 571 '5': Bacillus anthracis str. Ames '6': Bacillus anthracis str. Sterne ASM816v1 '7': Bacillus cereus ATCC 14579 '8': Bacillus clausii KSM-K16 '9': Bacillus pseudofirmus OF4 '10': Bacteroides fragilis YCH46 '11': Bacteroides thetaiotaomicron VPI-5482 '12': Bifidobacterium adolescentis ATCC 15703 '13': Bifidobacterium longum NCC2705 '14': Borrelia burgdorferi B31 '15': Brevibacillus brevis NBRC 100599 '16': Buchnera aphidicola str. Bp (Baizongia pistaciae) '17': Buchnera aphidicola str. Sg (Schizaphis graminum) Sg '18': Caldanaerobacter subterraneus subsp. tengcongensis MB4 '19': Candidatus Azobacteroides pseudotrichonymphae genomovar. CFP2 '20': Candidatus Vesicomyosocius okutanii HA '21': Chlamydia felis Fe/C-56 '22': Chlamydia trachomatis D/UW-3/CX '23': Chlamydophila caviae GPIC '24': Chlamydophila pneumoniae CWL029 '25': Chlamydophila pneumoniae TW-183 '26': Chlorobium tepidum TLS '27': Chromobacterium violaceum ATCC 12472 '28': Clostridioides difficile 630 ASM920v1 '29': Clostridium acetobutylicum ATCC 824 '30': Clostridium tetani E88 Massachusetts substr. E88 '31': Corynebacterium jeikeium K411 K411 = NCTC 11915 '32': Coxiella burnetii RSA 493 ASM776v1 '33': Deferribacter desulfuricans SSM1 '34': Dehalococcoides mccartyi CBDB1 '35': Deinococcus radiodurans R1 ASM856v1 '36': Desulfovibrio magneticus RS-1 '37': Enterococcus faecalis V583 ASM778v1 '38': Escherichia coli O157:H7 str. Sakai Sakai substr. RIMD 0509952 '39': Finegoldia magna ATCC 29328 '40': Francisella tularensis subsp. holarctica LVS ASM924v1 '41': Fusobacterium nucleatum subsp. nucleatum ATCC 25586 '42': Gemmatimonas aurantiaca T-27 '43': Geobacter sulfurreducens PCA '44': Haemophilus ducreyi 35000HP '45': Haloquadratum walsbyi DSM 16790 DSM 16790 = HBSQ001 '46': Helicobacter acinonychis str. 
Sheeba '47': Helicobacter hepaticus ATCC 51449 '48': Helicobacter pylori 26695 ASM852v1 '49': Hydrogenobacter thermophilus TK-6 ASM1078v1 '50': Idiomarina loihiensis L2TR '51': Kocuria rhizophila DC2201 '52': Lactobacillus fermentum IFO 3956 '53': Lactobacillus salivarius UCC118 '54': Lactococcus lactis subsp. lactis Il1403 IL1403 '55': Macrococcus caseolyticus JCSC5402 '56': Magnetospirillum magneticum AMB-1 '57': Mannheimia succiniciproducens MBEL55E '58': Methanocella paludicola SANAE '59': Methanococcus voltae A3 '60': Methanopyrus kandleri AV19 '61': Methanosarcina acetivorans C2A '62': Methanothermobacter thermautotrophicus str. Delta H '63': Methylococcus capsulatus str. Bath '64': Microcystis aeruginosa NIES-843 '65': Mycobacterium avium subsp. paratuberculosis K-10 '66': Neisseria gonorrhoeae FA 1090 '67': Neisseria meningitidis MC58 '68': Nitratiruptor sp. SB155-2 ASM1032v1 '69': Nitrosomonas europaea ATCC 19718 '70': Nostoc sp. PCC 7120 ASM970v1 '71': Onion yellows phytoplasma OY-M onion yellows '72': Orientia tsutsugamushi str. Ikeda '73': Pelotomaculum thermopropionicum SI '74': Picrophilus torridus DSM 9790 '75': Porphyromonas gingivalis ATCC 33277 '76': Prochlorococcus marinus subsp. marinus str. CCMP1375 '77': Propionibacterium acnes KPA171202 '78': Pseudomonas putida KT2440 '79': Pyrobaculum aerophilum str. IM2 '80': Pyrococcus furiosus DSM 3638 '81': Ralstonia solanacearum GMI1000 '82': Rickettsia conorii str. Malish 7 '83': Rickettsia typhi str. Wilmington '84': Rothia mucilaginosa DY-18 '85': Shigella flexneri 2a str. 301 '86': Sinorhizobium meliloti 1021 '87': Sodalis glossinidius str. 'morsitans' morsitans '88': Staphylococcus epidermidis ATCC 12228 ASM764v1 '89': Staphylococcus haemolyticus JCSC1435 '90': Staphylococcus saprophyticus subsp. 
saprophyticus ATCC 15305 ASM1012v1 '91': Streptococcus agalactiae 2603V/R '92': Streptococcus mutans UA159 '93': Streptococcus pyogenes M1 GAS SF370 '94': Streptococcus uberis 0140J '95': Streptomyces avermitilis MA-4680 = NBRC 14893 MA-4680 ASM976v2 '96': Streptomyces griseus subsp. griseus NBRC 13350 '97': Sulfolobus solfataricus P2 '98': Sulfurovum sp. NBC37-1 ASM1034v1 '99': Symbiobacterium thermophilum IAM 14863 IAM14863 '100': Synechococcus elongatus PCC 6301 '101': Synechocystis sp. PCC 6803 ASM972v1 '102': Thermococcus kodakarensis KOD1 '103': Thermotoga maritima MSB8 ASM854v1 '104': Treponema denticola ATCC 35405 '105': Treponema pallidum subsp. pallidum str. Nichols ASM860v1 '106': Tropheryma whipplei str. Twist '107': Vibrio cholerae O1 biovar El Tor str. N16961 '108': Vibrio vulnificus YJ016 '109': Wigglesworthia glossinidia endosymbiont of Glossina brevipalpis '110': Wolbachia endosymbiont of Drosophila melanogaster wMel '111': Wolbachia endosymbiont strain TRS of Brugia malayi '112': Xanthomonas campestris pv. campestris str. ATCC 33913 '113': Xanthomonas oryzae pv. oryzae KACC 10331 '114': Xylella fastidiosa 9a5c '115': Yersinia enterocolitica subsp. enterocolitica 8081 '116': Yersinia pestis CO92 ASM906v1 '117': Zymomonas mobilis subsp. mobilis ZM4 = ATCC 31821 ZM4 '118': '[Bacillus thuringiensis] serovar konkukian str. 97-27' '119': '[Pseudomonas syringae] pv. tomato str. DC3000' '120': homo sapiens splits: - name: train num_bytes: 683959051 num_examples: 1000000 - name: validation num_bytes: 68390921 num_examples: 100000 download_size: 158127793 dataset_size: 752349972 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* --- # Dataset Card for "autotrain-data-species_classify" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
teknium/dataforge-economics
--- language: - en pretty_name: "DataForge-Economics" tags: - economics license: mit --- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/YmaINbgYmLpgTGR6ESXji.png) # Dataset Card for dataforge-economics ## Table of Contents - [Overview](#overview) - [Dataset Description](#dataset-description) - [Data Collection and Synthesis](#data-collection-and-synthesis) - [Data Structure](#data-structure) - [Licensing, Privacy, and Ethics](#licensing-privacy-and-ethics) - [Access](#access) - [Usage](#usage) - [Citation](#citation) - [Contributions](#contributions) ## Overview This dataset, `teknium/dataforge-economics`, is a specialized collection of 1,000 synthetic examples in the field of economics. It has been generated using OpenAI's GPT-4 and a custom data synthesis pipeline named DataForge, developed by me. ## Dataset Description ### Data Collection and Synthesis The data in `teknium/dataforge-economics` has been synthetically generated using OpenAI's GPT-4 language model. The synthesis process was enhanced and structured using the DataForge pipeline, which incorporates domain-specific knowledge and ensures relevance in economics topics. ### Data Structure - **Size of dataset:** 1000 examples - **Type of data:** Textual (Economics domain-specific) - **Data format:** JSON - **Fields:** - id: a randomly generated uuid - conversations: single-turn human & gpt turns in sharegpt format - source: the dataset name itself, for metadata purposes when merging with others - topic: the sub-topic for the domain - system_prompt: type of system prompt used for generating the response. ## Licensing, Privacy, and Ethics - **License:** MIT License - **Special Considerations:** This dataset is purely generated from GPT-4 data; some information may be incorrect or invalid. - **Privacy:** As the dataset is synthetically generated, it does not contain any real individual's data.
## Access - **Availability:** General Access ## Usage This dataset is a domain-specialist dataset, the first to use my new pipeline, DataForge, which can create domain-expert knowledge (and tasks, as seen in the Trismegistus occult dataset). This dataset was a proof of concept to improve upon an Orca model's economics expertise; finetuned over Stable Beluga, the resulting model surpassed my custom benchmark for economics.
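The record layout described under Data Structure can be sketched like this. The field names come from the card; all the values below are invented for illustration:

```python
import json
import uuid

# One synthetic record in the layout the card describes: a single-turn
# ShareGPT-style conversation plus metadata fields. Values are illustrative.
record = {
    "id": str(uuid.uuid4()),                      # randomly generated uuid
    "conversations": [
        {"from": "human", "value": "Explain the law of diminishing marginal utility."},
        {"from": "gpt", "value": "As a consumer uses more units of a good, ..."},
    ],
    "source": "dataforge-economics",              # dataset name, for merging
    "topic": "microeconomics",                    # assumed sub-topic
    "system_prompt": "You are an expert economist.",  # assumed prompt text
}

# Round-trip through JSON, the card's stated storage format.
assert record == json.loads(json.dumps(record))
print(sorted(record.keys()))
```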
BrainArtLabs/LiminalSourceDiffusionV1
--- license: cc-by-4.0 ---
hyungkwonko/chart-llm
--- license: bsd-2-clause language: - en tags: - Vega-Lite - Chart - Visualization size_categories: - 1K<n<10K ---
open-llm-leaderboard/details_Kukedlc__SuperMente-7B-v4
--- pretty_name: Evaluation run of Kukedlc/SuperMente-7B-v4 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Kukedlc/SuperMente-7B-v4](https://huggingface.co/Kukedlc/SuperMente-7B-v4) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__SuperMente-7B-v4\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-15T22:23:21.393421](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__SuperMente-7B-v4/blob/main/results_2024-03-15T22-23-21.393421.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6392802020340652,\n\ \ \"acc_stderr\": 0.03235715366568653,\n \"acc_norm\": 0.6388159808811307,\n\ \ \"acc_norm_stderr\": 0.03302845637959357,\n \"mc1\": 0.5361077111383109,\n\ \ \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.7145914661594538,\n\ \ \"mc2_stderr\": 0.014632935395796937\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.013631345807016193,\n\ \ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382318\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.69398526190002,\n \ \ \"acc_stderr\": 0.00459894072237409,\n \"acc_norm\": 0.8763194582752439,\n\ \ \"acc_norm_stderr\": 0.0032854391911219137\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\ \ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\ \ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\ \ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\ \ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\ \ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\ \ \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\ \ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\ \ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\ \ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\ \ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\ acc_norm\": 0.43386243386243384,\n 
\"acc_norm_stderr\": 0.025525034382474894\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\ \ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\ \ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"\ acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\ acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\ acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391945,\n\ \ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.02293514405391945\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\ \ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \ \ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126243,\n \"\ acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126243\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\ acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\ acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586223,\n \ \ \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586223\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\ \ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\ \ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\ \ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\ \ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\ \ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\ \ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\ \ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\ \ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\ \ \"acc_stderr\": 0.01392775137200151,\n \"acc_norm\": 0.8135376756066411,\n\ \ \"acc_norm_stderr\": 0.01392775137200151\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n\ \ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\ \ \"acc_stderr\": 0.016476342210254,\n \"acc_norm\": 
0.4145251396648045,\n\ \ \"acc_norm_stderr\": 0.016476342210254\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\ \ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\ \ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\ \ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \ \ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\ \ \"acc_stderr\": 0.012733671880342511,\n \"acc_norm\": 0.4621903520208605,\n\ \ \"acc_norm_stderr\": 0.012733671880342511\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\ \ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \ \ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\ \ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\ \ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\ \ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\ \ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\ \ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\ \ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5361077111383109,\n\ \ \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.7145914661594538,\n\ \ \"mc2_stderr\": 0.014632935395796937\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \ \ \"acc_stderr\": 0.012714401009923647\n }\n}\n```" repo_url: https://huggingface.co/Kukedlc/SuperMente-7B-v4 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|arc:challenge|25_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-15T22-23-21.393421.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|gsm8k|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_15T22_23_21.393421 path: - 
'**/details_harness|hellaswag|10_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-23-21.393421.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-23-21.393421.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-23-21.393421.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-23-21.393421.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-23-21.393421.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-23-21.393421.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-23-21.393421.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-management|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-23-21.393421.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|truthfulqa:mc|0_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-15T22-23-21.393421.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_15T22_23_21.393421 path: - '**/details_harness|winogrande|5_2024-03-15T22-23-21.393421.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-15T22-23-21.393421.parquet' - config_name: results data_files: - split: 
2024_03_15T22_23_21.393421 path: - results_2024-03-15T22-23-21.393421.parquet - split: latest path: - results_2024-03-15T22-23-21.393421.parquet ---

# Dataset Card for Evaluation run of Kukedlc/SuperMente-7B-v4

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Kukedlc/SuperMente-7B-v4](https://huggingface.co/Kukedlc/SuperMente-7B-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Kukedlc__SuperMente-7B-v4",
    "harness_winogrande_5",
    split="train",
)
```

## Latest results

These are the [latest results from run 2024-03-15T22:23:21.393421](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__SuperMente-7B-v4/blob/main/results_2024-03-15T22-23-21.393421.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6392802020340652, "acc_stderr": 0.03235715366568653, "acc_norm": 0.6388159808811307, "acc_norm_stderr": 0.03302845637959357, "mc1": 0.5361077111383109, "mc1_stderr": 0.017457800422268625, "mc2": 0.7145914661594538, "mc2_stderr": 0.014632935395796937 }, "harness|arc:challenge|25": { "acc": 0.6800341296928327, "acc_stderr": 0.013631345807016193, "acc_norm": 0.7047781569965871, "acc_norm_stderr": 0.013329750293382318 }, "harness|hellaswag|10": { "acc": 0.69398526190002, "acc_stderr": 0.00459894072237409, "acc_norm": 0.8763194582752439, "acc_norm_stderr": 0.0032854391911219137 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.02863723563980089, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.02863723563980089 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 
0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201942, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201942 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.046774730044911984, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.046774730044911984 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43386243386243384, "acc_stderr": 0.025525034382474894, "acc_norm": 0.43386243386243384, "acc_norm_stderr": 0.025525034382474894 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726855, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726855 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.02293514405391945, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.02293514405391945 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6461538461538462, "acc_stderr": 0.024243783994062157, "acc_norm": 0.6461538461538462, "acc_norm_stderr": 0.024243783994062157 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066482, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066482 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977927, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977927 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8532110091743119, "acc_stderr": 0.015173141845126243, "acc_norm": 0.8532110091743119, "acc_norm_stderr": 0.015173141845126243 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 
0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078966, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078966 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8312236286919831, "acc_stderr": 0.024381406832586223, "acc_norm": 0.8312236286919831, "acc_norm_stderr": 0.024381406832586223 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.03050028317654585, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.03050028317654585 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070417, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 
0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8135376756066411, "acc_stderr": 0.01392775137200151, "acc_norm": 0.8135376756066411, "acc_norm_stderr": 0.01392775137200151 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508297, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508297 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4145251396648045, "acc_stderr": 0.016476342210254, "acc_norm": 0.4145251396648045, "acc_norm_stderr": 0.016476342210254 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137894, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137894 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.02592237178881877, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.02592237178881877 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.02517104191530968, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.02517104191530968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4621903520208605, "acc_stderr": 0.012733671880342511, "acc_norm": 0.4621903520208605, "acc_norm_stderr": 0.012733671880342511 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 
0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.02927956741106568, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.02927956741106568 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.5361077111383109, "mc1_stderr": 0.017457800422268625, "mc2": 0.7145914661594538, "mc2_stderr": 0.014632935395796937 }, "harness|winogrande|5": { "acc": 0.8208366219415943, "acc_stderr": 0.010777949156047987 }, "harness|gsm8k|5": { "acc": 0.6921910538286581, "acc_stderr": 0.012714401009923647 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
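As a hedged illustration of how the aggregated metrics relate to the per-task numbers in the "Latest results" section, the sketch below averages three of the hendrycksTest accuracies shown there; it assumes the aggregate is an unweighted mean over tasks.

```python
# Illustrative sketch only: averages three of the per-task accuracies from the
# "Latest results" section, assuming the aggregate is an unweighted mean.
task_acc = {
    "abstract_algebra": 0.31,
    "anatomy": 0.562962962962963,
    "astronomy": 0.6907894736842105,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")
```

The leaderboard aggregate is of course computed over all evaluated tasks, not just these three.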
CyberHarem/misaka_mikoto_bluearchive
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of misaka_mikoto/御坂美琴/御坂美琴 (Blue Archive)

This is the dataset of misaka_mikoto/御坂美琴/御坂美琴 (Blue Archive), containing 500 images and their tags. The core tags of this character are `brown_hair, brown_eyes, short_hair, hair_ornament, medium_hair`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                                   | Type       | Description                                                          |
|:-----------------|-------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw              | 500    | 638.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/misaka_mikoto_bluearchive/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 500    | 561.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/misaka_mikoto_bluearchive/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 1181   | 1.03 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/misaka_mikoto_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading.
If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/misaka_mikoto_bluearchive',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.

### Raw Text Version

| #  | Samples | Img-1                             | Img-2                             | Img-3                             | Img-4                             | Img-5                             | Tags |
|---:|--------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----|
| 0  | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, summer_uniform, tokiwadai_school_uniform, white_shirt, looking_at_viewer, brown_sweater_vest, short_sleeves, simple_background, white_background, v-neck, collared_shirt, blush, electrokinesis, upper_body, dress_shirt, hair_flower, closed_mouth, pleated_skirt, smile, breasts, emblem, psychic |
| 1  | 6  | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, aiming_at_viewer, brown_sweater_vest, electrokinesis, incoming_attack, looking_at_viewer, outstretched_arm, psychic, school_emblem, short_sleeves, solo, summer_uniform, tokiwadai_school_uniform, upper_body, holding_coin, smile, white_shirt, science_fiction, sleeveless_sweater, breasts, brown_vest, one_eye_closed |
| 2  | 5  | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, electrokinesis, shorts_under_skirt, solo, summer_uniform, sweater_vest, tokiwadai_school_uniform |
| 3  | 5  | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, electrokinesis, loafers, loose_socks, solo, summer_uniform, tokiwadai_school_uniform, brown_sweater_vest, full_body, open_mouth, short_shorts, shorts_under_skirt, simple_background, psychic, grey_skirt, looking_at_viewer, pleated_skirt, shirt, short_sleeves, smile, white_background, white_socks |
| 4  | 8  | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blazer, brown_jacket, hair_flower, long_sleeves, looking_at_viewer, plaid_skirt, pleated_skirt, red_bowtie, solo, tokiwadai_school_uniform, white_shirt, winter_uniform, blush, closed_mouth, collared_shirt, blue_skirt, smile, white_background, white_flower, cowboy_shot, simple_background, hair_between_eyes, miniskirt, standing, shorts_under_skirt, twitter_username |
| 5  | 6  | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, hairpin, solo, tokiwadai_school_uniform, blazer, winter_uniform, plaid_skirt, blush, bow, coin |
| 6  | 8  | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, gym_shirt, gym_uniform, sleeveless_shirt, solo, white_shirt, bare_shoulders, looking_at_viewer, school_emblem, small_breasts, bare_arms, gym_shorts, blush, hair_flower, short_shorts, white_background, white_shorts, closed_mouth, open_mouth, simple_background, :d, cowboy_shot, full_body, shoes, white_flower |
| 7  | 7  | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, frilled_bikini, polka_dot_bikini, solo, bikini_skirt, navel, blush, day, outdoors, water, beach, flower |
| 8  | 11 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, enmaided, solo, blush, maid_headdress, maid_apron, black_dress, frilled_apron, thighhighs, white_apron, bow, looking_at_viewer, ribbon, short_sleeves, simple_background, white_background, zettai_ryouiki, detached_collar, open_mouth, puffy_sleeves |
| 9  | 9  | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, kimono, solo, hair_flower, looking_at_viewer, smile, wide_sleeves, blush, long_sleeves, obi, closed_mouth, white_background, happy_new_year, simple_background, upper_body |
| 10 | 5  | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, blush, hair_flower, halterneck, looking_at_viewer, navel, small_breasts, solo, twitter_username, collarbone, bare_shoulders, grin, orange_bikini, string_bikini, white_flower, long_sleeves, open_jacket, side-tie_bikini_bottom, stomach, upper_body, white_background, white_jacket |
| 11 | 7  | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, blush, competition_swimsuit, looking_at_viewer, solo, multicolored_swimsuit, black_one-piece_swimsuit, blue_sky, cloud, collarbone, day, outdoors, small_breasts, hairclip, open_mouth, pool, sitting, smile |
| 12 | 5  | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, solo, sundress, day, open_mouth, smile, cloud, fang, sky, barefoot, downblouse, hair_flower, hairpin |
| 13 | 6  | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | 1girl, christmas, santa_costume, santa_hat, solo, blush, thighhighs, bare_shoulders |
| 14 | 6  | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | 1girl, casual, baseball_cap, black_shirt, heart_print, looking_at_viewer, solo, short_shorts, short_sleeves, electrokinesis, ponytail, psychic |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | summer_uniform | tokiwadai_school_uniform | white_shirt | looking_at_viewer | brown_sweater_vest | short_sleeves | simple_background | white_background | v-neck | collared_shirt | blush | electrokinesis | upper_body | dress_shirt | hair_flower | closed_mouth | pleated_skirt | smile | breasts | emblem | psychic | aiming_at_viewer | incoming_attack | outstretched_arm | school_emblem | holding_coin | science_fiction | sleeveless_sweater | brown_vest | one_eye_closed | shorts_under_skirt | sweater_vest | loafers
| loose_socks | full_body | open_mouth | short_shorts | grey_skirt | shirt | white_socks | blazer | brown_jacket | long_sleeves | plaid_skirt | red_bowtie | winter_uniform | blue_skirt | white_flower | cowboy_shot | hair_between_eyes | miniskirt | standing | twitter_username | hairpin | bow | coin | gym_shirt | gym_uniform | sleeveless_shirt | bare_shoulders | small_breasts | bare_arms | gym_shorts | white_shorts | :d | shoes | frilled_bikini | polka_dot_bikini | bikini_skirt | navel | day | outdoors | water | beach | flower | enmaided | maid_headdress | maid_apron | black_dress | frilled_apron | thighhighs | white_apron | ribbon | zettai_ryouiki | detached_collar | puffy_sleeves | kimono | wide_sleeves | obi | happy_new_year | halterneck | collarbone | grin | orange_bikini | string_bikini | open_jacket | side-tie_bikini_bottom | stomach | white_jacket | competition_swimsuit | multicolored_swimsuit | black_one-piece_swimsuit | blue_sky | cloud | hairclip | pool | sitting | sundress | fang | sky | barefoot | downblouse | christmas | santa_costume | santa_hat | casual | baseball_cap | black_shirt | heart_print | ponytail | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:-----------------|:---------------------------|:--------------|:--------------------|:---------------------|:----------------|:--------------------|:-------------------|:---------|:-----------------|:--------|:-----------------|:-------------|:--------------|:--------------|:---------------|:----------------|:--------|:----------|:---------|:----------|:-------------------|:------------------|:-------------------|:----------------|:---------------|:------------------|:---------------------|:-------------|:-----------------|:---------------------|:---------------|:----------|:--------------|:------------|:-------------|:---------------|:-------------|:--------|:--------------|:---------|:---------------|:---------------|:--------------|:-------------|:-----------------|:-------------|:---------------|:--------------|:--------------------|:------------|:-----------|:-------------------|:----------|:------|:-------|:------------|:--------------|:-------------------|:-----------------|:----------------|:------------|:-------------|:---------------|:-----|:--------|:-----------------|:-------------------|:---------------|:--------|:------|:-----------|:--------|:--------|:---------|:-----------|:-----------------|:-------------|:--------------|:----------------|:-------------|:--------------|:---------|:-----------------|:------------------|:----------------|:---------|:---------------|:------|:-----------------|:-------------|:-------------|:-------|:----------------|:----------------|:--------------|:-------------------------|:----------|:---------------|:-----------------------|:------------------------|:---------------------------|:-----------|:--------|:-----------|:-------|:----------|:-----------|:-------|:------|:-----------|:-------------|:-------
-----|:----------------|:------------|:---------|:---------------|:--------------|:--------------|:-----------| | 0 | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | | | | | | X | X | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | X | X | X | X | X | | | | X | | | | | X | X | | | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) 
| ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | X | X | | | X | X | | X | X | | | | X | X | X | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | | X | X | | | X | X | | | X | | | | X | X | | | | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 11 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | | X 
| | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | | | X | | | X | X | | | X | | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | X | | | | X | | | | X | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | X | | | | | | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | X | | | | X | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 12 | 5 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | X | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | 
| | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | | | | | | | | | | 13 | 6 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | 14 | 6 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | X | X | | | | X | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
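The cluster tables above use a fixed pipe-delimited row layout: cluster id, sample count, five preview images, and a trailing comma-separated tag list. A minimal sketch of pulling the tags out of one such row with only the standard library — the row layout is inferred from the tables here, and `parse_cluster_row` is an illustrative helper, not part of the dataset tooling:

```python
def parse_cluster_row(row: str):
    """Split one markdown table row into (cluster_id, n_samples, tags)."""
    # Drop the outer pipes, then split on the remaining column separators.
    cells = [c.strip() for c in row.strip().strip("|").split("|")]
    cluster_id = int(cells[0])
    n_samples = int(cells[1])
    # Columns 2-6 hold the preview image links; the last cell is the tag list.
    tags = [t.strip() for t in cells[-1].split(",") if t.strip()]
    return cluster_id, n_samples, tags

row = ("| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) "
       "| ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) "
       "| ![](samples/2/clu2-sample4.png) | 1girl, electrokinesis, shorts_under_skirt, "
       "solo, summer_uniform, sweater_vest, tokiwadai_school_uniform |")
cluster_id, n_samples, tags = parse_cluster_row(row)
print(cluster_id, n_samples, tags)  # → 2 5 ['1girl', 'electrokinesis', ...]
```

The same helper works for either table variant, since both keep the tag list in the final cell.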
open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B
--- pretty_name: Evaluation run of h4rz3rk4s3/TinyPoliticaLlama-1.1B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [h4rz3rk4s3/TinyPoliticaLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyPoliticaLlama-1.1B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-09T20:58:23.188763](https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B/blob/main/results_2024-03-09T20-58-23.188763.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2592181333001868,\n\ \ \"acc_stderr\": 0.030987518923392604,\n \"acc_norm\": 0.26142713275875756,\n\ \ \"acc_norm_stderr\": 0.031810688522865525,\n \"mc1\": 0.21542227662178703,\n\ \ \"mc1_stderr\": 0.014391902652427688,\n \"mc2\": 0.3805985213573371,\n\ \ \"mc2_stderr\": 0.01395025708087029\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.295221843003413,\n \"acc_stderr\": 0.013329750293382316,\n\ \ \"acc_norm\": 0.3378839590443686,\n \"acc_norm_stderr\": 0.013822047922283514\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4320852419836686,\n\ \ \"acc_stderr\": 0.0049435372423444176,\n \"acc_norm\": 0.5782712607050389,\n\ \ \"acc_norm_stderr\": 0.004928263494616739\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\ \ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\ \ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123408,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123408\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.02674989977124123,\n\ \ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.02674989977124123\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\ \ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\ \ \"acc_norm_stderr\": 
0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\ \ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\ \ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\ \ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.026148818018424495,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.026148818018424495\n \ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\ \ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\ \ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\ \ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\ acc_norm\": 0.24867724867724866,\n 
\"acc_norm_stderr\": 0.022261817692400168\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\ \ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\ \ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1935483870967742,\n\ \ \"acc_stderr\": 0.02247525852553606,\n \"acc_norm\": 0.1935483870967742,\n\ \ \"acc_norm_stderr\": 0.02247525852553606\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n\ \ \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\ : 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\ acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860657,\n\ \ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860657\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.02311936275823229,\n \ \ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.02311936275823229\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \ \ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\ \ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\ acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.2018348623853211,\n \"acc_stderr\": 0.01720857935778757,\n \"\ acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.01720857935778757\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257017,\n \"\ acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257017\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\ \ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\ \ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\ \ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\ \ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\ \ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\ \ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\ \ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\ \ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\ \ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n\ \ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\ \ \"acc_stderr\": 0.015464676163395977,\n \"acc_norm\": 0.24904214559386972,\n\ \ \"acc_norm_stderr\": 0.015464676163395977\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n\ \ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\ \ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\ \ \"acc_norm_stderr\": 0.014422292204808835\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\ \ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\ \ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n\ \ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.21631205673758866,\n \"acc_stderr\": 0.024561720560562796,\n \ \ \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.024561720560562796\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\ \ \"acc_stderr\": 0.011015752255279341,\n \"acc_norm\": 0.2470664928292047,\n\ \ \"acc_norm_stderr\": 0.011015752255279341\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\ \ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504634,\n \ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504634\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\ \ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\ \ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\ \ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.27860696517412936,\n\ \ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\ \ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\ \ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\ \ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\ \ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n\ \ \"mc1_stderr\": 0.014391902652427688,\n \"mc2\": 0.3805985213573371,\n\ \ \"mc2_stderr\": 0.01395025708087029\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5769534333070244,\n \"acc_stderr\": 0.013885055359056476\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/h4rz3rk4s3/TinyPoliticaLlama-1.1B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_09T20_58_23.188763 path: - '**/details_harness|arc:challenge|25_2024-03-09T20-58-23.188763.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-09T20-58-23.188763.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_09T20_58_23.188763 path: - '**/details_harness|gsm8k|5_2024-03-09T20-58-23.188763.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-09T20-58-23.188763.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_09T20_58_23.188763 path: - '**/details_harness|hellaswag|10_2024-03-09T20-58-23.188763.parquet' - split: 
latest path: - '**/details_harness|hellaswag|10_2024-03-09T20-58-23.188763.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_09T20_58_23.188763 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet' - 
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_physics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_computer_security_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_econometrics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_global_facts_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_human_aging_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_public_relations_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_security_studies_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_sociology_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|truthfulqa:mc|0_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_winogrande_5
  data_files:
  - split: 2024_03_09T20_58_23.188763
    path:
    - '**/details_harness|winogrande|5_2024-03-09T20-58-23.188763.parquet'
  - split: latest
    path:
    - '**/details_harness|winogrande|5_2024-03-09T20-58-23.188763.parquet'
- config_name: results
  data_files:
  - split:
      2024_03_09T20_58_23.188763
    path:
    - results_2024-03-09T20-58-23.188763.parquet
  - split: latest
    path:
    - results_2024-03-09T20-58-23.188763.parquet
---

# Dataset Card for Evaluation run of h4rz3rk4s3/TinyPoliticaLlama-1.1B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [h4rz3rk4s3/TinyPoliticaLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyPoliticaLlama-1.1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-03-09T20:58:23.188763](https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B/blob/main/results_2024-03-09T20-58-23.188763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.2592181333001868, "acc_stderr": 0.030987518923392604, "acc_norm": 0.26142713275875756, "acc_norm_stderr": 0.031810688522865525, "mc1": 0.21542227662178703, "mc1_stderr": 0.014391902652427688, "mc2": 0.3805985213573371, "mc2_stderr": 0.01395025708087029 },
    "harness|arc:challenge|25": { "acc": 0.295221843003413, "acc_stderr": 0.013329750293382316, "acc_norm": 0.3378839590443686, "acc_norm_stderr": 0.013822047922283514 },
    "harness|hellaswag|10": { "acc": 0.4320852419836686, "acc_stderr": 0.0049435372423444176, "acc_norm": 0.5782712607050389, "acc_norm_stderr": 0.004928263494616739 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.17777777777777778, "acc_stderr": 0.033027898599017176, "acc_norm": 0.17777777777777778, "acc_norm_stderr": 0.033027898599017176 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123408, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123408 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2528301886792453, "acc_stderr": 0.02674989977124123, "acc_norm": 0.2528301886792453, "acc_norm_stderr": 0.02674989977124123 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.2777777777777778, "acc_stderr": 0.037455547914624555, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.037455547914624555 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.2543352601156069, "acc_stderr": 0.0332055644308557, "acc_norm": 0.2543352601156069, "acc_norm_stderr": 0.0332055644308557 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.044405219061793275, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.044405219061793275 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2, "acc_stderr": 0.026148818018424495, "acc_norm": 0.2, "acc_norm_stderr": 0.026148818018424495 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.20175438596491227, "acc_stderr": 0.037752050135836386, "acc_norm": 0.20175438596491227, "acc_norm_stderr": 0.037752050135836386 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.25517241379310346, "acc_stderr": 0.03632984052707842, "acc_norm": 0.25517241379310346, "acc_norm_stderr": 0.03632984052707842 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24867724867724866, "acc_stderr": 0.022261817692400168, "acc_norm": 0.24867724867724866, "acc_norm_stderr": 0.022261817692400168 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1935483870967742, "acc_stderr": 0.02247525852553606, "acc_norm": 0.1935483870967742, "acc_norm_stderr": 0.02247525852553606 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.18719211822660098, "acc_stderr": 0.027444924966882618, "acc_norm": 0.18719211822660098, "acc_norm_stderr": 0.027444924966882618 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24242424242424243, "acc_stderr": 0.03346409881055953, "acc_norm": 0.24242424242424243, "acc_norm_stderr": 0.03346409881055953 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.18686868686868688, "acc_stderr": 0.027772533334218977, "acc_norm": 0.18686868686868688, "acc_norm_stderr": 0.027772533334218977 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860657, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860657 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2948717948717949, "acc_stderr": 0.02311936275823229, "acc_norm": 0.2948717948717949, "acc_norm_stderr": 0.02311936275823229 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.027309140588230193, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.027309140588230193 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.24789915966386555, "acc_stderr": 0.028047967224176892, "acc_norm": 0.24789915966386555, "acc_norm_stderr": 0.028047967224176892 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.24503311258278146, "acc_stderr": 0.035118075718047245, "acc_norm": 0.24503311258278146, "acc_norm_stderr": 0.035118075718047245 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.2018348623853211, "acc_stderr": 0.01720857935778757, "acc_norm": 0.2018348623853211, "acc_norm_stderr": 0.01720857935778757 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.28703703703703703, "acc_stderr": 0.030851992993257017, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.030851992993257017 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.3452914798206278, "acc_stderr": 0.03191100192835794, "acc_norm": 0.3452914798206278, "acc_norm_stderr": 0.03191100192835794 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.29770992366412213, "acc_stderr": 0.04010358942462203, "acc_norm": 0.29770992366412213, "acc_norm_stderr": 0.04010358942462203 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25, "acc_stderr": 0.04186091791394607, "acc_norm": 0.25, "acc_norm_stderr": 0.04186091791394607 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 },
    "harness|hendrycksTest-management|5": { "acc": 0.1941747572815534, "acc_stderr": 0.03916667762822585, "acc_norm": 0.1941747572815534, "acc_norm_stderr": 0.03916667762822585 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.2863247863247863, "acc_stderr": 0.02961432369045665, "acc_norm": 0.2863247863247863, "acc_norm_stderr": 0.02961432369045665 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.24904214559386972, "acc_stderr": 0.015464676163395977, "acc_norm": 0.24904214559386972, "acc_norm_stderr": 0.015464676163395977 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2514450867052023, "acc_stderr": 0.02335736578587404, "acc_norm": 0.2514450867052023, "acc_norm_stderr": 0.02335736578587404 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.24115755627009647, "acc_stderr": 0.024296594034763426, "acc_norm": 0.24115755627009647, "acc_norm_stderr": 0.024296594034763426 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.2345679012345679, "acc_stderr": 0.023576881744005716, "acc_norm": 0.2345679012345679, "acc_norm_stderr": 0.023576881744005716 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.21631205673758866, "acc_stderr": 0.024561720560562796, "acc_norm": 0.21631205673758866, "acc_norm_stderr": 0.024561720560562796 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.2470664928292047, "acc_stderr": 0.011015752255279341, "acc_norm": 0.2470664928292047, "acc_norm_stderr": 0.011015752255279341 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.20220588235294118, "acc_stderr": 0.024398192986654924, "acc_norm": 0.20220588235294118, "acc_norm_stderr": 0.024398192986654924 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.23529411764705882, "acc_stderr": 0.01716058723504634, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.01716058723504634 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm":
0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.19591836734693877, "acc_stderr": 0.025409301953225678, "acc_norm": 0.19591836734693877, "acc_norm_stderr": 0.025409301953225678 }, "harness|hendrycksTest-sociology|5": { "acc": 0.27860696517412936, "acc_stderr": 0.031700561834973086, "acc_norm": 0.27860696517412936, "acc_norm_stderr": 0.031700561834973086 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-virology|5": { "acc": 0.29518072289156627, "acc_stderr": 0.035509201856896294, "acc_norm": 0.29518072289156627, "acc_norm_stderr": 0.035509201856896294 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.21542227662178703, "mc1_stderr": 0.014391902652427688, "mc2": 0.3805985213573371, "mc2_stderr": 0.01395025708087029 }, "harness|winogrande|5": { "acc": 0.5769534333070244, "acc_stderr": 0.013885055359056476 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
liuyanchen1015/MULTI_VALUE_stsb_invariant_tag_can_or_not
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: score dtype: float64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 1438 num_examples: 7 - name: test num_bytes: 838 num_examples: 7 - name: train num_bytes: 4124 num_examples: 28 download_size: 12970 dataset_size: 6400 --- # Dataset Card for "MULTI_VALUE_stsb_invariant_tag_can_or_not" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
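A minimal loading sketch for the card above, assuming the standard `datasets` API; the `load_stsb_variant` helper name is hypothetical, and the `train`/`dev`/`test` split names and field list come from the YAML metadata:

```python
def load_stsb_variant(split: str = "train"):
    """Fetch one split ("train", "dev", or "test") of this dataset from the Hub.

    Requires the `datasets` package (pip install datasets) and network access.
    """
    from datasets import load_dataset

    return load_dataset(
        "liuyanchen1015/MULTI_VALUE_stsb_invariant_tag_can_or_not",
        split=split,
    )
```

Each returned example should expose the `sentence1`, `sentence2`, `score`, `idx`, and `value_score` fields listed in the YAML above.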
rmihiranga/english-text-fullfill-v1
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 2852865 num_examples: 2834 download_size: 1783867 dataset_size: 2852865 configs: - config_name: default data_files: - split: train path: data/train-* ---
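A minimal loading sketch, assuming the standard `datasets` API; the `iter_texts` helper name is hypothetical, while the single `train` split and the `text` column come from the YAML above:

```python
def iter_texts(limit: int = 5):
    """Yield up to `limit` strings from the "text" column of the train split.

    Requires the `datasets` package (pip install datasets) and network access.
    """
    from datasets import load_dataset

    ds = load_dataset("rmihiranga/english-text-fullfill-v1", split="train")
    for row in ds.select(range(min(limit, len(ds)))):
        yield row["text"]
```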
open-llm-leaderboard/details_jisukim8873__falcon-7B-case-3
--- pretty_name: Evaluation run of jisukim8873/falcon-7B-case-3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [jisukim8873/falcon-7B-case-3](https://huggingface.co/jisukim8873/falcon-7B-case-3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jisukim8873__falcon-7B-case-3\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-19T02:19:19.586473](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__falcon-7B-case-3/blob/main/results_2024-02-19T02-19-19.586473.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3282659962571978,\n\ \ \"acc_stderr\": 0.03298331789034154,\n \"acc_norm\": 0.3301115002349692,\n\ \ \"acc_norm_stderr\": 0.033761421405656425,\n \"mc1\": 0.24724602203182375,\n\ \ \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.36433197582570465,\n\ \ \"mc2_stderr\": 0.014190689156837067\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.44112627986348124,\n \"acc_stderr\": 0.014509747749064664,\n\ \ \"acc_norm\": 0.4778156996587031,\n \"acc_norm_stderr\": 0.014597001927076136\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5955984863572994,\n\ \ \"acc_stderr\": 0.004897728370737241,\n \"acc_norm\": 0.783011352320255,\n\ \ \"acc_norm_stderr\": 0.0041135241598451115\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\ \ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\ \ \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \ \ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.35094339622641507,\n \"acc_stderr\": 0.029373646253234686,\n\ \ \"acc_norm\": 0.35094339622641507,\n \"acc_norm_stderr\": 0.029373646253234686\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\ \ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\ \ \"acc_norm_stderr\": 
0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653697\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\ \ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.35260115606936415,\n\ \ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n\ \ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761926,\n\ \ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761926\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\ \ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\ \ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n\ \ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"\ acc_norm\": 
0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\ \ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\ \ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.33548387096774196,\n \"acc_stderr\": 0.026860206444724356,\n \"\ acc_norm\": 0.33548387096774196,\n \"acc_norm_stderr\": 0.026860206444724356\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"\ acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\ : 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268048,\n\ \ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268048\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.3939393939393939,\n \"acc_stderr\": 0.03481285338232963,\n \"\ acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.03481285338232963\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414357,\n\ \ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414357\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.30512820512820515,\n \"acc_stderr\": 0.023346335293325887,\n\ \ \"acc_norm\": 0.30512820512820515,\n \"acc_norm_stderr\": 0.023346335293325887\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \ \ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188704,\n \ \ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188704\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\ acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3192660550458716,\n \"acc_stderr\": 0.019987829069750006,\n \"\ acc_norm\": 0.3192660550458716,\n \"acc_norm_stderr\": 0.019987829069750006\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355157,\n \"\ acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355157\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.3235294117647059,\n \"acc_stderr\": 0.03283472056108567,\n \"\ acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.03283472056108567\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.3628691983122363,\n \"acc_stderr\": 0.03129920825530213,\n \ \ \"acc_norm\": 0.3628691983122363,\n \"acc_norm_stderr\": 0.03129920825530213\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\ \ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n\ \ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.041864451630137495,\n\ \ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.041864451630137495\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\ acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\ \ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.045416094465039476,\n\ \ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.045416094465039476\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4017094017094017,\n\ \ \"acc_stderr\": 0.032116937510516204,\n \"acc_norm\": 0.4017094017094017,\n\ \ \"acc_norm_stderr\": 0.032116937510516204\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40485312899106,\n\ \ \"acc_stderr\": 0.017553246467720263,\n \"acc_norm\": 0.40485312899106,\n\ \ \"acc_norm_stderr\": 0.017553246467720263\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.33815028901734107,\n \"acc_stderr\": 0.025469770149400175,\n\ \ \"acc_norm\": 0.33815028901734107,\n \"acc_norm_stderr\": 0.025469770149400175\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\ \ \"acc_stderr\": 0.014487500852850409,\n 
\"acc_norm\": 0.25027932960893856,\n\ \ \"acc_norm_stderr\": 0.014487500852850409\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.026787453111906532,\n\ \ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.026787453111906532\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\ \ \"acc_stderr\": 0.026664410886937613,\n \"acc_norm\": 0.3279742765273312,\n\ \ \"acc_norm_stderr\": 0.026664410886937613\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02584224870090217,\n\ \ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02584224870090217\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090202,\n \ \ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090202\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\ \ \"acc_stderr\": 0.011285033165551288,\n \"acc_norm\": 0.26597131681877445,\n\ \ \"acc_norm_stderr\": 0.011285033165551288\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.02972215209928006,\n\ \ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.02972215209928006\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2875816993464052,\n \"acc_stderr\": 0.018311653053648222,\n \ \ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.018311653053648222\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.03071356045510849,\n\ \ \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 
0.03071356045510849\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3681592039800995,\n\ \ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.3681592039800995,\n\ \ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\ \ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\ \ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.4152046783625731,\n \"acc_stderr\": 0.03779275945503201,\n\ \ \"acc_norm\": 0.4152046783625731,\n \"acc_norm_stderr\": 0.03779275945503201\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\ \ \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.36433197582570465,\n\ \ \"mc2_stderr\": 0.014190689156837067\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638256\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06141015921152388,\n \ \ \"acc_stderr\": 0.006613027536586315\n }\n}\n```" repo_url: https://huggingface.co/jisukim8873/falcon-7B-case-3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|arc:challenge|25_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-19T02-19-19.586473.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|gsm8k|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hellaswag_10 data_files: - split: 
2024_02_19T02_19_19.586473 path: - '**/details_harness|hellaswag|10_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-19-19.586473.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-19-19.586473.parquet' - 
'**/details_harness|hendrycksTest-management|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-19-19.586473.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-19-19.586473.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-19-19.586473.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-19-19.586473.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-19-19.586473.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-19-19.586473.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-management|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-19-19.586473.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-19-19.586473.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|truthfulqa:mc|0_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-19T02-19-19.586473.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_19T02_19_19.586473 path: - '**/details_harness|winogrande|5_2024-02-19T02-19-19.586473.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-19T02-19-19.586473.parquet' - config_name: results data_files: - split: 
2024_02_19T02_19_19.586473 path: - results_2024-02-19T02-19-19.586473.parquet - split: latest path: - results_2024-02-19T02-19-19.586473.parquet
---

# Dataset Card for Evaluation run of jisukim8873/falcon-7B-case-3

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [jisukim8873/falcon-7B-case-3](https://huggingface.co/jisukim8873/falcon-7B-case-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

# Splits are named "latest" or with the timestamp of a specific run
data = load_dataset("open-llm-leaderboard/details_jisukim8873__falcon-7B-case-3",
                    "harness_winogrande_5",
                    split="latest")
```

## Latest results

These are the [latest results from run 2024-02-19T02:19:19.586473](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__falcon-7B-case-3/blob/main/results_2024-02-19T02-19-19.586473.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3282659962571978, "acc_stderr": 0.03298331789034154, "acc_norm": 0.3301115002349692, "acc_norm_stderr": 0.033761421405656425, "mc1": 0.24724602203182375, "mc1_stderr": 0.01510240479735965, "mc2": 0.36433197582570465, "mc2_stderr": 0.014190689156837067 }, "harness|arc:challenge|25": { "acc": 0.44112627986348124, "acc_stderr": 0.014509747749064664, "acc_norm": 0.4778156996587031, "acc_norm_stderr": 0.014597001927076136 }, "harness|hellaswag|10": { "acc": 0.5955984863572994, "acc_stderr": 0.004897728370737241, "acc_norm": 0.783011352320255, "acc_norm_stderr": 0.0041135241598451115 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04072314811876837, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.29605263157894735, "acc_stderr": 0.03715062154998905, "acc_norm": 0.29605263157894735, "acc_norm_stderr": 0.03715062154998905 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.22, "acc_stderr": 0.0416333199893227, "acc_norm": 0.22, "acc_norm_stderr": 0.0416333199893227 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.35094339622641507, "acc_stderr": 0.029373646253234686, "acc_norm": 0.35094339622641507, "acc_norm_stderr": 0.029373646253234686 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.18, "acc_stderr": 0.03861229196653697, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653697 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, 
"acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.35260115606936415, "acc_stderr": 0.03643037168958548, "acc_norm": 0.35260115606936415, "acc_norm_stderr": 0.03643037168958548 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.04023382273617748, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.04023382273617748 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.35319148936170214, "acc_stderr": 0.031245325202761926, "acc_norm": 0.35319148936170214, "acc_norm_stderr": 0.031245325202761926 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.042663394431593935, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.042663394431593935 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.296551724137931, "acc_stderr": 0.03806142687309994, "acc_norm": 0.296551724137931, "acc_norm_stderr": 0.03806142687309994 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.0220190800122179, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.0220190800122179 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.20634920634920634, "acc_stderr": 0.036196045241242515, "acc_norm": 0.20634920634920634, "acc_norm_stderr": 0.036196045241242515 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.33548387096774196, "acc_stderr": 0.026860206444724356, "acc_norm": 0.33548387096774196, "acc_norm_stderr": 0.026860206444724356 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.30049261083743845, "acc_stderr": 0.03225799476233484, "acc_norm": 0.30049261083743845, "acc_norm_stderr": 0.03225799476233484 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.296969696969697, "acc_stderr": 0.03567969772268048, "acc_norm": 0.296969696969697, "acc_norm_stderr": 0.03567969772268048 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3939393939393939, "acc_stderr": 0.03481285338232963, "acc_norm": 0.3939393939393939, "acc_norm_stderr": 0.03481285338232963 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.35233160621761656, "acc_stderr": 0.03447478286414357, "acc_norm": 0.35233160621761656, "acc_norm_stderr": 0.03447478286414357 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.30512820512820515, "acc_stderr": 0.023346335293325887, "acc_norm": 0.30512820512820515, "acc_norm_stderr": 0.023346335293325887 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3067226890756303, "acc_stderr": 0.02995382389188704, "acc_norm": 0.3067226890756303, "acc_norm_stderr": 0.02995382389188704 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3192660550458716, "acc_stderr": 0.019987829069750006, "acc_norm": 0.3192660550458716, "acc_norm_stderr": 0.019987829069750006 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.18518518518518517, "acc_stderr": 
0.026491914727355157, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.026491914727355157 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.3235294117647059, "acc_stderr": 0.03283472056108567, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.03283472056108567 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3628691983122363, "acc_stderr": 0.03129920825530213, "acc_norm": 0.3628691983122363, "acc_norm_stderr": 0.03129920825530213 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.3721973094170404, "acc_stderr": 0.032443052830087304, "acc_norm": 0.3721973094170404, "acc_norm_stderr": 0.032443052830087304 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3511450381679389, "acc_stderr": 0.041864451630137495, "acc_norm": 0.3511450381679389, "acc_norm_stderr": 0.041864451630137495 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.38016528925619836, "acc_stderr": 0.04431324501968432, "acc_norm": 0.38016528925619836, "acc_norm_stderr": 0.04431324501968432 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04557239513497752, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04557239513497752 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3067484662576687, "acc_stderr": 0.036230899157241474, "acc_norm": 0.3067484662576687, "acc_norm_stderr": 0.036230899157241474 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 },
    "harness|hendrycksTest-management|5": { "acc": 0.30097087378640774, "acc_stderr": 0.045416094465039476, "acc_norm": 0.30097087378640774, "acc_norm_stderr": 0.045416094465039476 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.4017094017094017, "acc_stderr": 0.032116937510516204, "acc_norm": 0.4017094017094017, "acc_norm_stderr": 0.032116937510516204 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.40485312899106, "acc_stderr": 0.017553246467720263, "acc_norm": 0.40485312899106, "acc_norm_stderr": 0.017553246467720263 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.33815028901734107, "acc_stderr": 0.025469770149400175, "acc_norm": 0.33815028901734107, "acc_norm_stderr": 0.025469770149400175 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25027932960893856, "acc_stderr": 0.014487500852850409, "acc_norm": 0.25027932960893856, "acc_norm_stderr": 0.014487500852850409 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.3235294117647059, "acc_stderr": 0.026787453111906532, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.026787453111906532 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.3279742765273312, "acc_stderr": 0.026664410886937613, "acc_norm": 0.3279742765273312, "acc_norm_stderr": 0.026664410886937613 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.3148148148148148, "acc_stderr": 0.02584224870090217, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.02584224870090217 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2553191489361702, "acc_stderr": 0.02601199293090202, "acc_norm": 0.2553191489361702, "acc_norm_stderr": 0.02601199293090202 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.26597131681877445, "acc_stderr": 0.011285033165551288, "acc_norm": 0.26597131681877445, "acc_norm_stderr": 0.011285033165551288 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.39705882352941174, "acc_stderr": 0.02972215209928006, "acc_norm": 0.39705882352941174, "acc_norm_stderr": 0.02972215209928006 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2875816993464052, "acc_stderr": 0.018311653053648222, "acc_norm": 0.2875816993464052, "acc_norm_stderr": 0.018311653053648222 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.3090909090909091, "acc_stderr": 0.044262946482000985, "acc_norm": 0.3090909090909091, "acc_norm_stderr": 0.044262946482000985 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.35918367346938773, "acc_stderr": 0.03071356045510849, "acc_norm": 0.35918367346938773, "acc_norm_stderr": 0.03071356045510849 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.3681592039800995, "acc_stderr": 0.03410410565495301, "acc_norm": 0.3681592039800995, "acc_norm_stderr": 0.03410410565495301 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 },
    "harness|hendrycksTest-virology|5": { "acc": 0.3313253012048193, "acc_stderr": 0.036643147772880864, "acc_norm": 0.3313253012048193, "acc_norm_stderr": 0.036643147772880864 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.4152046783625731, "acc_stderr": 0.03779275945503201, "acc_norm": 0.4152046783625731, "acc_norm_stderr": 0.03779275945503201 },
    "harness|truthfulqa:mc|0": { "mc1": 0.24724602203182375, "mc1_stderr": 0.01510240479735965, "mc2": 0.36433197582570465, "mc2_stderr": 0.014190689156837067 },
    "harness|winogrande|5": { "acc": 0.7103393843725335, "acc_stderr": 0.012748550807638256 },
    "harness|gsm8k|5": { "acc": 0.06141015921152388, "acc_stderr": 0.006613027536586315 }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used.
-->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations.
-->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
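The evaluation results earlier in this card are plain JSON as emitted by the `lm-evaluation-harness`, so they can be post-processed without any special tooling. A minimal sketch, standard library only, with a two-task excerpt of the results above inlined for illustration (ranking tasks by accuracy):

```python
import json

# Two-task excerpt of the harness results shown above,
# values copied verbatim from the card.
results = json.loads("""
{
  "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.47, "acc_stderr": 0.05016135580465919},
  "harness|gsm8k|5": {"acc": 0.06141015921152388, "acc_stderr": 0.006613027536586315}
}
""")

# Rank tasks by accuracy, highest first.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{task}: acc={metrics['acc']:.3f} ± {metrics['acc_stderr']:.3f}")
```

The same loop applies unchanged to the full results object; only the excerpt is abbreviated here.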
heliosprime/twitter_dataset_1713152930
---
dataset_info:
  features:
    - name: id
      dtype: string
    - name: tweet_content
      dtype: string
    - name: user_name
      dtype: string
    - name: user_id
      dtype: string
    - name: created_at
      dtype: string
    - name: url
      dtype: string
    - name: favourite_count
      dtype: int64
    - name: scraped_at
      dtype: string
    - name: image_urls
      dtype: string
  splits:
    - name: train
      num_bytes: 3981
      num_examples: 12
  download_size: 8692
  dataset_size: 3981
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
---

# Dataset Card for "twitter_dataset_1713152930"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
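Per the `dataset_info` block above, each row of the train split is a flat record with string fields plus an `int64` `favourite_count`. A minimal, self-contained sketch of working with rows of that shape (the two rows here are synthetic, invented for illustration; only the field names come from the YAML):

```python
# Synthetic rows matching the `dataset_info` features above;
# the values are made up, only the keys follow the declared schema.
rows = [
    {"id": "1", "tweet_content": "hello", "user_name": "a", "favourite_count": 3},
    {"id": "2", "tweet_content": "world", "user_name": "b", "favourite_count": 7},
]

# Average favourite_count across the split.
avg_fav = sum(r["favourite_count"] for r in rows) / len(rows)
print(avg_fav)  # 5.0
```

In practice the real split (12 examples, per the YAML) would be loaded from the Hub rather than hand-built like this.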